
Citation: Ryan Anthony J. de Belen, Huyen Nguyen, Daniel Filonik, Dennis Del Favero, Tomasz Bednarz. A systematic review of the current state of collaborative mixed reality technologies: 2013–2018[J]. AIMS Electronics and Electrical Engineering, 2019, 3(2): 181-223. doi: 10.3934/ElectrEng.2019.2.181
Abbreviations: HHD: Handheld Device; HMD: Head-mounted Display; PC: desktop/laptop; SAR: Spatial Augmented Reality
While there are already existing surveys of Augmented Reality (AR) user studies [1,2,3,4], to the authors' knowledge there is still no systematic review of the current state of collaborative Mixed Reality (MR) research. To help the MR community identify opportunities for future research, this paper provides a high-level overview of collaborative MR studies from 2013 to 2018. The goals are to give readers an overview of collaborative MR research, highlight papers that are making an impact in their respective application areas, identify research gaps, and lay out promising research directions. Through this, researchers may be encouraged to explore and pursue directions that will further develop and improve collaborative applications in MR. Most importantly, the paper summarises how collaborative MR environments have changed and improved the communication and interactions of collocated and remote users, and how MR environments might change the future of research discovery, exploration, and informed decision making across several research disciplines.
Section 1 touches on the reality-virtuality continuum, describes the characteristics of collaboration in MR, and explains the motivations for developing this technology. Section 2 provides a high-level overview of the reviewed papers categorised based on application areas, types of display devices used, collaboration setups, and user interaction and experience aspects. Section 3 details the current state of collaborative MR research in each application area and highlights remarkable papers which are making an impact in their respective application areas. Section 4 focuses on user interaction and user experience aspects, which are gaining importance as previous problems (i.e. tracking and registration) are now being solved with the advances in technology. Section 5 describes research gaps in some areas and directions that require further research. Finally, Section 6 concludes the paper with a summary.
Figure 1 depicts the reality-virtuality continuum [5]. Reality is the perception of the real environment without the use of any technology. Augmented Reality (AR) is a technology which allows near-seamless, real-time integration and interaction of virtual and physical objects in the environment [6]. Augmented Virtuality (AV) captures real objects and integrates them into a virtual environment. Virtual Reality (VR) is a technology which immerses users in a completely virtual, often computer-generated, environment to simulate an experience. This paper focuses on technologies that operate somewhere in between real environments and completely virtual ones, collectively called Mixed Reality (MR) [5,7]. Recent advances in display technologies are driving down the cost of MR devices, which were previously only available within specialised facilities. As a result, there has been significant adoption of MR devices across different application areas, such as Architecture, Engineering, Construction, and Operations (AECO) [8], Education and Training [9], Industrial [10], and Entertainment and Gaming [11].
Although recent MR developments have proven valuable in single-user applications [1], collaborative MR applications present a promising area of research. Collaborative MR applications are believed to address two major issues that are usually present in traditional collaborative computer applications: seamlessness and reality enhancement [12]. Existing taxonomies divide collaborative MR systems along different dimensions [13,14,15]. One important dimension is the interaction space. It is subdivided into collocated setups, where users are in the same location; remote setups, where users are in different locations; and variable setups, where groups of collocated users can collaborate with remote users. Another crucial dimension is the temporal reference of the interaction. This distinguishes systems limited to synchronous interactions, where users are present at the same time, from systems that also support asynchronous interactions, where users are present at different times. In this paper, a clear focus is given to the interaction space and temporal reference of the reviewed papers.
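These two dimensions form a simple coding scheme for classifying collaborative MR systems. As a minimal illustrative sketch (the class and field names below are our own invention, not drawn from any reviewed taxonomy), the scheme could be modelled as:

```python
from dataclasses import dataclass
from enum import Enum

class InteractionSpace(Enum):
    COLLOCATED = "collocated"  # users share the same physical location
    REMOTE = "remote"          # users are in different locations
    VARIABLE = "variable"      # collocated groups collaborate with remote users

class TemporalReference(Enum):
    SYNCHRONOUS = "synchronous"    # users are present at the same time
    ASYNCHRONOUS = "asynchronous"  # users are present at different times

@dataclass(frozen=True)
class CollaborationSetup:
    """A point in the two-dimensional taxonomy described above."""
    space: InteractionSpace
    time: TemporalReference

# Example: a remote expert guiding a local worker in real time
tele_assistance = CollaborationSetup(InteractionSpace.REMOTE,
                                     TemporalReference.SYNCHRONOUS)
```

Each reviewed system occupies one cell of this 3 × 2 grid; the survey's later statistics (Section 2) tally papers along the interaction-space axis.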
In recent years, industries and research domains have been growing rapidly in scale, complexity, and interdisciplinarity. Complex problems now require more knowledge than any single person possesses because the experience and expertise relevant to a problem are usually distributed among different professionals [16]. Bringing different, sometimes opposing, points of view together to create a shared understanding of a problem among stakeholders can lead to new insights, innovative ideas, and interesting artefacts. With the advances in information and communication technologies (ICTs), shared environments that facilitate multidisciplinary collaboration are now being developed.
Mixed Reality (MR) enhances a user's perception of the real world and allows near-seamless interaction between virtual objects and the real world. Information overlay allows remote collaborators to annotate a user's view, and it enhances communication between collaborators by providing visual, auditory, or haptic cues. Supporting collaboration in MR applications therefore provides even greater advantages, especially when distributed stakeholders are needed to complete a task. To help the MR community identify opportunities for future research, this survey provides a systematic review of the current state of collaborative MR research published from 2013 to 2018. It covers research in different application areas that utilises different display configurations, collaboration setups, and user interaction and experience aspects (see Section 3). While the application areas are not exhaustive, they cover the areas explored so far.
This section presents the research methods employed to carry out the systematic review process. Section 2.1 details the search process done to collect papers for this review and Section 2.2 explains the review process employed to ensure that the papers follow our review criterion.
This systematic review was made as inclusive as practically possible. Using the search terms: (ⅰ) "Augmented Reality" AND "Collaboration" and (ⅱ) "Mixed Reality" AND "Collaboration", we collected papers from the Scopus bibliographic database, IEEEXplore, and ACM Digital Library. The search for the terms was made in Title, Abstract, and Keywords fields. All search results published in conferences and journals between 2013 and 2018 were taken into consideration.
The search results were scanned individually to identify whether or not each paper supported collaboration in MR. Only 259 papers satisfied this criterion. The number of citations each paper had received to date was obtained through Google Scholar and was used to gauge its impact on its application area. The most highly cited papers were discussed to understand exemplary applications from each domain. An Excel spreadsheet was created to systematically keep track of all the reviewed papers. The review of each paper focused on the following attributes:
● application areas and keywords,
● types of display devices used,
● user collaboration setups, and
● user interaction and experience aspects.
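As a sketch of how one row of such a review spreadsheet might be represented programmatically (the field names, record contents, and ranking helper below are hypothetical, for illustration only):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReviewedPaper:
    """One row of the tracking spreadsheet (illustrative field names)."""
    title: str
    year: int
    application_area: str  # e.g. "Education and Training"
    keywords: List[str] = field(default_factory=list)
    display_devices: List[str] = field(default_factory=list)  # e.g. ["HMD", "HHD"]
    collaboration_setup: str = "collocated"  # "collocated" | "remote" | "variable"
    citations: int = 0  # Google Scholar count, used to gauge impact

# Hypothetical records; the most-cited paper per area is highlighted in Section 3.
papers = [
    ReviewedPaper("Paper A", 2015, "Medicine", citations=40),
    ReviewedPaper("Paper B", 2017, "Medicine", citations=120),
]
most_cited = max(papers, key=lambda p: p.citations)
```

Sorting each application area's rows by the `citations` column is what surfaces the exemplary papers discussed later in the survey.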
Although considerable effort was made to be systematic and thorough during the selection and review process, the described method still has limitations. While multiple bibliographic databases were used to cover a wide range of publication venues and topics, it remains likely that some relevant papers were not included in this review. In addition, the search terms used might be limiting, as other papers could have used different keywords to describe "Augmented Reality" and "Mixed Reality".
Overall, 259 collaborative MR papers published between 2013 and 2018 were reviewed and categorised. Figure 2a categorises the papers into application areas. Figure 2b analyses the types of display devices used in collaborative MR applications. Figure 2c summarises the factors usually considered to establish collaboration in MR. Figure 2d shows the user collaboration setups in the reported papers.
The papers were categorised into different application areas (see Figure 2a): (ⅰ) Architecture, Engineering, Construction, and Operations (AECO) (21 papers, or 14%), (ⅱ) Education and Training (42 papers, or 29%), (ⅲ) Entertainment and Gaming (40 papers, or 28%), (ⅳ) Industrial (22 papers, or 15%), (ⅴ) Medicine (12 papers, or 8%), and (ⅵ) Tourism and Heritage (8 papers, or 6%). The application areas emerged from an inductive analysis of the obtained papers. In addition, a number of papers did not fall into any of these application areas but were still included in this review because they focused primarily on user interaction and user experience aspects. Figure 2a also shows the change over time in the number of collaborative MR papers in these application areas. Education and Training, Entertainment and Gaming, and AECO seem to benefit most from collaborative MR systems, as they constituted more than half of the reviewed papers. Although there were fewer studies in Medicine and in Tourism and Heritage, recent research in these areas has already shown promising benefits of collaborative MR systems, suggesting a need for further studies. Finally, the analysis shows an increasing interest in applying collaborative MR systems in certain application areas. These will be discussed in a separate section of the paper (see Section 6).
The types of display devices used in the papers were also recorded (see Figure 2b). A significant number of papers (65 papers, or 25%) reported using head-mounted displays (HMDs), such as the Oculus Rift, HTC Vive, and Microsoft HoloLens, for their collaborative MR setup. This was followed by combinations of the different types of display devices mentioned in this subsection (60 papers, or 23%) and handheld displays (HHDs) (58 papers, or 22%). 38 papers (or 15%) reported using at least one kind of PC display (desktop or laptop) together with either HMDs, HHDs, or a projector system. Our survey shows that desktop displays are typically used by remote collaborators, as they are stationary and provide the additional computing power needed to guide or assist a local user. There is an increasing interest (22 papers, or 8%) in using spatial AR (SAR) through projector systems to enhance collaboration, perhaps because SAR affords users the ability to view and interact with digital information without being tethered to a display device. Finally, relatively few papers (16 papers, or 6%) used a combination of HHDs and HMDs.
The types of user interaction and user experience in collaborative MR were also considered. These were categorised as: (ⅰ) annotation techniques (89 papers), (ⅱ) cooperative object manipulation techniques (112 papers), and (ⅲ) user perception and cognition studies (55 papers) (see Figure 2c). Annotation techniques allow instructions and directions to be overlaid in the environment. They were found to be useful and effective in providing information to users during collaboration (as a 'guiding voice') [17]. Cooperative object manipulation techniques in collaborative MR environments were determined to be a promising way to decrease completion times of various tasks as multiple users can manipulate the same object at the same time [18]. There is also a need to handle privacy issues and view management of virtual objects in collaborative MR [19]. Finally, user perception and cognition studies, which aim to lessen cognitive workload and increase users' perceptual (e.g. situational, social, and task) awareness and presence, were also found to be an important factor to foster collaboration in MR environments [17].
The collaboration setups used in the papers were also recorded. Users can be remote, collaborating in MR from different locations; alternatively, users can be collocated during collaboration; finally, a combination of both setups was observed in a number of papers. Figure 2d shows the number of papers across the different collaboration setups: 129 papers (or 50%) used a remote collaboration setup, 103 papers (or 40%) used a collocated setup, and 27 papers (or 10%) used a variable setup with both collocated and remote collaboration. This analysis shows that collaborative MR systems supporting both collocated and remote collaboration remain underexplored. Similarly, while there have been numerous studies on synchronous collaborative MR applications, which require users to be present at the same time, there has not been the same interest in pursuing asynchronous MR applications.
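The reported percentage shares are consistent with the raw counts. A quick check (counts taken from Figure 2d):

```python
# Papers per collaboration setup, as reported in Figure 2d.
setup_counts = {"remote": 129, "collocated": 103, "variable": 27}

total = sum(setup_counts.values())  # all 259 reviewed papers
shares = {setup: round(100 * count / total)
          for setup, count in setup_counts.items()}

print(total)   # 259
print(shares)  # {'remote': 50, 'collocated': 40, 'variable': 10}
```

The same arithmetic applies to the display-device counts in Figure 2b (65 + 60 + 58 + 38 + 22 + 16 also sums to 259).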
In this section, the reviewed papers are categorised based on their respective application areas. In addition, the types of display devices used and collaboration setups for each paper are discussed. A high-level overview of the work done in each application area is also provided. Research papers that are making an impact in their respective areas are highlighted. Finally, interesting research directions for each application area are laid out.
Architecture, Engineering, Construction, and Operations (AECO) was the main application area of twenty-one papers. Expert coordination and discussion are found to be among the keys to success for all projects in this application area [20]. In every design phase, architects, engineers, and subcontractors need to work closely together to meet requirements and deadlines. By incorporating the experience and expertise of different professionals, a project team can reach high-quality decisions and innovations [21]. It is also evident that group discussions contribute to better and more efficient outcomes compared to individual decision making [22]. As projects become enormously complex, certain disciplines become increasingly specialised, and immense amounts of information are brought into meetings, current research is exploring multidisciplinary and multi-organisational discussion environments that can support the planning process between project teams. Recent advances in collaborative MR technologies have the potential to provide such environments by offering immersive experiences, physical embodiment, and immediate feedback to users that would be difficult to obtain through traditional design media [23]. Fourteen papers used a collocated MR setup, four used a solely remote MR setup, and three used a variable remote and collocated MR setup. Five papers used HHDs, five used HMDs, five used a combination of different display devices, three used a PC together with HHDs or HMDs, one used HHDs and HMDs, and two used SARs (see Table 1).
References | Topic | Display Devices Used | Collaboration Setup |
--- | --- | --- | --- |
Casarin, Pacqueriaud and Bechmann [25] | Construction | PC + HMD | Collocated |
Coppens and Mens [26] | Architectural modelling | HMD | Variable |
Cortés-Dávalos and Mendoza [27] | Layout Planning | HHD | Collocated |
Croft, Lucero, Neurnberger et al. [28] | Military Operations | HMD | Collocated |
Dong, Behzadan, Chen et al. [29] | Visualisation | HMD | Collocated |
Elvezio, Ling, Liu et al. [30] | Urban Data Exploration | HMD | Collocated |
Etzold, Grimm, Schweitzer et al. [31] | Construction | PC + HHD | Remote |
Flotyński and Sobociński [32] | Urban Design | Combination | Collocated |
Gül, Uzun and Halıcı [23] | Design Planning | Combination | Variable |
Ibayashi, Sugiura, Sakamoto et al. [33] | Architecture Design | Others | Collocated |
Leon, Doolan, Laing et al. [34] | Computational Design | Touchscreen Display | Collocated |
Li, Nee and Ong [35] | FE Structural Analysis | HHD | Collocated |
Lin, Liu, Tsai et al. [20] | Construction Discussion | HHD + Public Display | Collocated |
Nittala, Li, Cartwright et al. [36] | Field Operations | HHD + HMD | Remote |
Phan, Hönig and Ayanian [37] | Operations | HMD | Remote |
Rajeb and Leclercq [38] | Architectural Design | SAR | Variable |
Ro, Kim, Byun et al. [39] | Architectural Design | SAR | Remote |
Schattel, Tönnis, Klinker et al. [40] | Architectural Design | HHD | Collocated |
Shin, Ng and Saakes [41] | Interior Design | HHD | Collocated |
Singh and Delhi [42] | Layout Planning | HHD | Collocated |
Trout, Russell, Harrison et al. [43] | Military Operations | PC + HMD | Collocated |
Lin, Liu, Tsai et al. [20] provided an example of how collaboration in MR can create a visualisation environment that facilitates discussion during project meetings. They employed AR technologies to display public and private information: public information can be viewed on a stationary display, called a Building Information Modelling (BIM) Table, while private information can be viewed on mobile devices. By viewing public and private information separately, users can clearly grasp the whole picture of the construction project. In addition, the complexity of discussion-related information was reduced while keeping the necessary information available during the project meeting. The authors conducted a comparison test (with 36 participants) between their system and the conventional paper-based method to validate how users can benefit from their system. Completion time was significantly shortened using their system in both data-finding and problem-prediction tasks during the discussion process.
Even with extensive planning and coordination during meetings, misunderstandings, which can decrease efficiency and cause enormous additional costs and delays, are still likely [24]. Due to the increasing complexity of projects, relevant information and decision-making authority are often distributed across multiple locations and parties. Although it is ideal for all experts to be present during project meetings, this is not always possible due to varying schedules. To avoid these problems, current research is utilising MR devices to support and improve coordination, discussion, and collaboration between different AECO experts. MR allows users to see digital information, such as construction plans, design sketches and blueprints, and 3D Computer-Aided Design (CAD) models, in a shared environment. In addition, MR allows contextual information to be placed relative to a user's location in the environment. As a result, different experts can actively participate in discussions about strategies to meet deadlines, search for information more efficiently, and foresee potential problems faster. It would be interesting to investigate asynchronous collaboration in this application area. Asynchronous collaboration allows users to create and retain digital information for later consumption, providing an opportunity for members to revisit previous project meetings and brainstorming sessions while keeping annotated documents visible. Through this, team leaders can monitor their teams' progress and members can review the project timeline. Finally, it would be interesting to see different AECO firms collaborating remotely in MR environments during project meetings.
Education and Training are well-explored application areas in collaborative MR research; forty-two papers (29%) focused on them. As expected, most studies reported teaching applications, with a few focused on areas like military and sports training. The majority of the papers focused on improving learning through collaborative engagement and participation in subjects like natural science, history, computer science, and mathematics. Thirty-five papers used a collocated setup, six used a remote setup, and one used a variable setup. Eighteen papers reported using HHDs, nine used a combination of different display devices, six used SARs or projector systems, four used a combination of desktop computers and HHDs, HMDs, or SAR, three used HHDs and HMDs, and two used HMDs (see Table 2).
References | Topic | Display Devices Used | Collaboration Setup |
--- | --- | --- | --- |
Alhumaidan, Lo and Selby [44] | Learning | HHD | Collocated |
Alhumaidan, Lo and Selby [45] | Learning | HHD | Collocated |
Benavides, Amores and Maes [46] | Experiential learning | HMD | Remote |
Blanco-Fernández, López-Nores, Pazos-Arias et al. [47] | Immersive learning, human history | HHD | Collocated |
Boyce, Rowan, Baity et al. [48] | Military training | SAR | Collocated |
Bressler and Bodzin [49] | Learning, science forensic game | HHD | Collocated |
Chen, Fan and Wu [50] | Learning, horticultural science | HHD | Collocated |
Daiber, Kosmalla and Krüger [51] | Boulder training | HHD | Collocated |
Desai, Belmonte, Jin et al. [52] | Training, chemistry experiments | PC | Remote |
Fleck and Simon [53] | Learning, astronomy | SAR | Collocated |
Gazcón and Castro [54] | Learning | PC | Variable |
Gelsomini, Kanev, Hung et al. [55] | Learning, Kanji language | HHD | Collocated |
Gironacci, Mc-Call and Tamisier [56] | Storytelling, gamification | HHD + HMD | Collocated |
Goyal, Vijay, Monga et al. [57] | Learning, programming | HHD | Collocated |
Greenwald [58] | Situated Learning | HHD + HMD | Remote |
Han, Jo, Hyun et al. [59] | Learning, dramatic play | PC | Collocated |
Iftene and Trandabăț [60] | Learning | HHD | Collocated |
Jyun-Fong and Ju-Ling [61] | Learning, local history | HHD | Collocated |
Kang, Norooz, Oguamanam et al. [62] | Embodied interaction | SAR | Collocated |
Kazanidis, Palaigeorgiou, Papadopoulou et al. [63] | Learning, interactive videos | HHD + SAR | Collocated |
Keifert, Lee, Dahn et al. [64] | Children behaviour during collaborative activities | SAR | Collocated |
Kim and Kim [65] | Learning, English education | HHD | Collocated |
Krstulovic, Boticki and Ogata [66] | Learning | HHD | Collocated |
Le, Le and Tran [67] | Learning | HHD + HMD | Collocated |
MacIntyre, Zhang, Jones et al. [68] | Learning, programming | SAR | Collocated |
Malinverni, Valero, Schaper et al. [69] | Embodied Learning | HHD | Collocated |
Maskott, Maskott and Vrysis [70] | Learning, gamification | Combination | Collocated |
Pareto [71] | Learning, arithmetic games | Combination | Collocated |
Peters, Heijligers, de Kievith et al. [72] | Leadership training | HMD | Collocated |
Punjabi, Tung and Lin [73] | Learning by exploration | PC + HHD | Remote |
Rodríguez-Vizzuett, Pérez-Medina, Muñoz-Arteaga et al. [74] | Learning | Others | Collocated |
Sanabria and Arámburo-Lizárraga [75] | Learning | Combination | Collocated |
Shaer, Valdes, Liu et al. [76] | Experiential learning | Others | Collocated |
Shirazi and Behzadan [77] | Education, Construction | HHD | Collocated |
Shirazi and Behzadan [78] | Education, Construction | HHD | Collocated |
Sun, Liu, Zhang et al. [79] | Teaching | PC + HMD | Remote |
Sun, Zhang, Liu et al. [80] | Teaching | PC + HMD | Remote |
Thompson, Leavy, Lambeth et al. [81] | Education | HHD | Collocated |
Wiehr, Kosmalla, Daiber et al. [82] | Training, climbing | SAR | Collocated |
Yangguang, Yue and Xiaodong [83] | Training | HHD | Collocated |
Yoon, Wang and Elinich [84] | Learning | PC + SAR | Collocated |
Zubir, Suryani and Ghazali [85] | Learning | HHD | Collocated |
Bressler and Bodzin [49] investigated factors affecting student engagement during a collaborative mobile AR learning game. They conducted their study using a mixed-methods approach with pre- and post-surveys, field observations, and group interviews with 68 urban middle school students. The sample included 35 male (51.5%) and 33 female (48.5%) students aged 11 to 15. Students teamed up in groups of 3 or 4 and played a forensic science mystery game in which they analysed fingerprints, hairs, and other trace evidence. During the game they collaboratively solved investigative problems such as decoding locker combinations and determining suspects' intentions. The findings demonstrated that a collaborative mobile AR learning game increased interest in science, made learning fun and enjoyable, and facilitated teamwork and engagement as students learned from each other.
Collaborative MR environments were extensively used in the Education and Training application areas. By presenting learning content in a 3D perspective, MR educational applications can provide an environment where difficult and complex subjects, such as engineering concepts, are taught more easily [78]. This is particularly valuable for applications that support constructivist learning, which requires an authentic context [86]. These MR applications support collaborative and situated learning, adding a social dimension, providing high interactivity, and increasing students' engagement in learning activities. The work done in education is mostly directed towards collocated collaboration, where students learn collectively through active participation in the group. In addition, learning through gamification is seen as an innovative way to promote engagement. Research on the interaction between users and virtual objects in MR environments is important for collaborative training. Different interaction techniques have already been developed, but the most effective one is real hand interaction [87]. By utilising the affordances of the human hand, users can manipulate virtual objects quickly and precisely, with little conscious attention [88]. In addition, annotations are more effective at conveying spatial information than arrows or pointers [89].
A total of forty papers were reviewed in this application area. The majority reported how collaborative MR was used to play games, such as solving a 3D jigsaw puzzle, competing in board games, and consuming multimedia content. Twenty-four papers reported using a collocated collaboration setup, twelve used a remote setup, and four used a variable setup. Thirteen papers used a combination of different display devices, eight used HMDs, six used a combination of a PC and either HHDs or HMDs, five used SARs, four used both HHDs and HMDs, and four used HHDs (see Table 3).
References | Topic | Display Devices Used | Collaboration Setup |
--- | --- | --- | --- |
Akahoshi and Matsushita [92] | Game | Others | Collocated |
Baillard, Fradet, Alleaume et al. [93] | Media consumption | HHD + HMD | Collocated |
Baldauf and Fröhlich [94] | Media consumption | HHD + Public Display | Collocated |
Ballagas, Dugan, Revelle et al. [90] | Media consumption | HHD | Collocated |
Beimler, Bruder and Steinicke [95] | Animation application | PC + HMD + SAR | Collocated |
Bollam, Gothwal, Tejaswi V et al. [96] | Chess board game | HMD | Collocated |
Boonbrahm, Kaewrat and Boonbrahm [87] | 3D puzzle game | HHD | Remote |
Bourdin, Sanahuja, Moya et al. [97] | Entertainment, singing | HMD + CAVE | Remote |
Brondi, Avveduto, Alem et al. [98] | 3D jigsaw puzzle game | HMD | Remote |
Ch'ng, Harrison and Moore [99] | Interactive art | SAR | Collocated |
Courchesne, Durand and Roy [100] | Interactive art | Others | Remote |
Dal Corso, Olsen, Steenstrup et al. [101] | Game | SAR | Collocated |
Datcu, Lukosch and Lukosch [102] | Game | PC + HMD | Remote |
Datcu, Lukosch and Lukosch [103] | 3D block game | PC + HMD | Remote |
Figueroa, Hernández, Merienne et al. [104] | Game | PC + HMD | Variable |
Fischbach, Lugrin, Brandt et al. [105] | Board game | Tabletop | Collocated |
Fischbach, Striepe, Latoschik et al. [106] | Board game | SAR | Collocated |
Günther, Müller, Schmitz et al. [107] | Chess board game | HHD + HMD | Collocated |
Huo, Wang, Paredes et al. [108] | Coin collection game | HHD | Collocated |
Karakottas, Papachristou, Doumanoqlou et al. [109] | Immersive game | HHD + HMD | Remote |
Lantin, Overstall and Zhao [110] | Media art | HMD | Collocated |
Loviska, Krause, Engelbrecht et al. [111] | Game | HMD | Collocated |
Mackamul and Esteves [112] | Game, match pairs | HHD + SAR | Collocated |
Margolis and Cornish [113] | Cinema production | Combination | Remote |
McGill, Williamson and Brewster [114] | Media consumption | HMD | Remote |
Mechtley, Stein, Roberts et al. [115] | Media arts | SAR | Collocated |
Pillias, Robert-Bouchard and Levieux [116] | Tangible video game | Others | Collocated |
Podkosova and Kaufmann [117] | Game | HMD | Variable |
Prins, Gunkel, Stokking et al. [118] | Media consumption | PC + HMD | Remote |
Reilly, Salimian, MacKay et al. [19] | Game, privacy and security | Tabletop + Public Display | Variable |
Rostami, Bexell and Stanisic [119] | Immersive performance | HMD | Remote |
Sato, Hwang and Koike [120] | Game | SAR | Collocated |
Spielmann, Schuster, Götz et al. [121] | Film making | HHD + HMD | Collocated |
Trottnow, Götz, Seibert et al. [122] | Cinema production | PC + HHD + HMD | Collocated |
Valverde and Cochrane [123] | Performing arts | Others | Variable |
Van Troyer [124] | Theatre performance | Others | Remote |
Vermeer, Alaka, de Bruin et al. [125] | Game, lasers | HHD | Collocated |
Wegner, Seele, Buhler et al. [126] | Game | HMD | Collocated |
Zhou, Hagemann, Fels et al. [127] | 3D game and mental puzzle | Others | Collocated |
Zimmerer, Fischbach and Latoschik [128] | Tabletop game | HHD + Tabletop | Collocated |
Ballagas, Dugan, Revelle et al. [90] developed a multi-player augmented reality game layered on top of an Emmy Award-winning television show, The Electric Company. The game was designed to facilitate collocated collaboration and learning between siblings during joint consumption of media, using prototypes that combined mobile phones and web-based video. In the final game design, siblings must collaborate to collect and return words stolen by the prankster Manny, a character in the show. The authors observed nine pairs of siblings aged between 6 and 10 and video-recorded them while they played. Afterwards, a researcher interviewed the children in a semi-structured format, and the video recordings were reviewed to analyse the siblings' interactions and conversations. During the pilot test, the authors found that siblings made better sense of media content when using the application. In addition, the siblings displayed physical coordination by gesturing, verbally referencing physical objects, and guiding each other.
Although MR gaming applications have already been developed, most are accessible only through expensive devices (such as the HTC Vive, Oculus Rift, or Microsoft HoloLens and other Windows Mixed Reality devices) with differing platforms and frameworks. This fragmentation makes it hard to support collaborative scenarios in MR, since players need applications that let them join a shared session regardless of the platform they own [91]. However, several factors make collaborative MR more accessible and attract more people to MR for entertainment and gaming: advances in mobile connectivity and computing power, and a growing number of technology companies providing APIs with collaborative features for developers (e.g. Apple ARKit, Vuforia, and Google ARCore). Furthermore, companies like Facebook are enabling shared experiences between users at remote locations, in which they interact through virtual avatars that typically reflect user movements captured by the user's HMD, external sensors, or controller input. The proliferation of collaborative MR gaming applications can also be attributed to the increasing support game engines (such as Unity3D and Unreal Engine) provide for creating complex networked applications. This opens a large area for exploration in collaborative MR gaming, especially on mobile platforms, paving the way for more widespread adoption. Natural user interactions that do not cause user fatigue should be researched and developed for sustained use of these technologies. Both collocated and remote collaboration environments could help drive the entertainment and gaming application area.
There was a total of twenty-two papers which used collaborative MR environments for industrial applications. All papers aimed to improve tasks during the repair and maintenance of equipment, as well as manufacturing and assembly-related tasks. In this application area, collaboration in which local users are assisted or guided by remote users was the most common setup, with a total of seventeen papers; three papers had a collocated setup and only two had a variable setup. Nine papers used a combination of different display devices for their collaborative MR setup; six papers had remote users on desktop displays while local users used HHDs, HMDs, or projector systems; and seven papers had users collaborate with the same type of display device (HHDs or HMDs) (see Table 4). Notably, less intrusive display devices were favoured so that local users could use both hands to accomplish industrial tasks.
References | Topic | Display Devices Used | Collaboration Setup |
Abramovici, Wolf, Adwernat et al. [130] | Maintenance | HHD | Collocated |
Aschenbrenner, Li, Dukalski et al. [131] | Production Line Planning | HMD | Variable |
Bednarz, James, Widzyk-Capehart et al. [132] | Mining Industry | Combination | Remote |
Capodieci, Mainetti and Alem [133] | Maintenance | HMD + Multitouch | Remote |
Choi, Kim and Lee [134] | Industry | HHD | Remote |
Clergeaud, Roo, Hachet et al. [135] | Industry | HMD + Spatial | Remote |
Datcu, Cidota, Lukosch et al. [136] | Inflight Maintenance | Combination | Remote |
Domova, Vartiainen and Englund [137] | Industry | PC + HHD | Remote |
Elvezio, Sukan, Oda et al. [138] | Assembly, maintenance | HMD | Remote |
Funk, Kritzler and Michahelles [139] | Assembly | HMD | Collocated |
Galambos, Csapó, Zentay et al. [140] | Manufacturing | Combination | Remote |
Galambos, Baranyi and Rudas [141] | Manufacturing | Others | Remote |
Gauglitz, Nuernberger, Turk et al. [129] | Car repair | PC + HHD | Remote |
Gupta, Ucler and Bernard [142] | New product development, Aviation industry | HMD | Remote |
Gurevich, Lanir and Cohen [143] | Industry | PC + SAR | Remote |
Günther, Kratz, Avrahami et al. [144] | Industry | PC + HMD | Remote |
Morosi, Carli, Caruso et al. [145] | Product design | HHD + SAR | Collocated |
Plopski, Fuvattanasilp, Poldi et al. [146] | Maintenance | HHD | Remote |
Seo, Lee, Park et al. [147] | Industry | Combination | Variable |
Zenati, Hamidia, Bellarbi et al. [148] | Maintenance | PC + HMD | Remote |
Zenati, Benbelkacem, Belhocine et al. [149] | Maintenance | PC + HMD | Remote |
Zenati-Henda, Bellarbi, Benbelkacem et al. [150] | Maintenance | HMD + Multitouch | Remote |
One of the most influential papers in this application area was written by Gauglitz, Nuernberger, Turk et al. [129]. The authors developed a system which allows live mobile remote collaboration on car repair tasks. Local workers use a lightweight tablet, while remote experts use a commodity PC. Remote experts can communicate with local workers and place spatial annotations which are automatically reflected in the local user's view. In addition, remote experts can independently navigate through the local worker's scene. To create a controlled study setup, the authors used proxy tasks that allowed users to communicate while doing little or no physical labour. They evaluated the effectiveness of their collaborative application through an extensive outdoor within-subjects study with 60 participants. They recorded the number of user errors and obtained user feedback through post-study surveys. In addition, they compared their system with two baseline interfaces: a video-only interface and a video interface annotated with static information. Most (80%) of the test participants found their application preferable and usable.
A majority of the papers in this application area focused on the repair and maintenance of equipment, as well as manufacturing and assembly-related tasks. Many of these studies explored a remote expert collaboration setup where remote expert users provided help to local, usually less experienced, users. In such a setup, the main concern is how to establish effective communication between the remote user and the local user. Promising progress has already been made in conveying information, especially spatial information, from drawing annotations and pointer cues to reconstructing part or the whole body of the remote expert. Projecting the remote user's hand gestures into the local user's environment has been found to be an effective way of providing task awareness and decreasing cognitive load [151]. It is promising to explore improvements in the non-verbal communication cues used during remote collaboration, and further work is needed to convey complex instructions easily during industrial tasks. In a remote collaboration scenario, the remote user can have either an independent or a dependent view of the local user's shared environment; investigating the effects of the remote user's view on collaboration is another important research topic, and the independent view has been found to have several benefits over the dependent view [17]. Our survey showed that less intrusive devices, which allow local users to freely use both hands for task completion, are desirable in this application area. Finally, this application area can also benefit from asynchronous collaboration. The workflow of industrial workers usually involves shifts in which outgoing workers hand over their work to incoming workers, and outgoing workers need to recount all completed tasks to give incoming workers insight into overall progress. Retaining these actions as spatial information within the workplace would be a major step towards a more effective and efficient workflow.
One of the most promising areas for collaborative MR is the medical sciences, with twelve papers in this application area. Different medical experts can visualise the same information and share their knowledge to generate more meaningful insights from medical data. Nine papers had a remote collaboration setup in which a remote specialist provided care to a patient, two papers had a variable setup, and one had a collocated setup. Six papers used HMDs, three used HHDs, one used PCs and HMDs, one used HHDs and HMDs, and one used a combination of the aforementioned devices (see Table 5).
References | Topic | Display Devices Used | Collaboration Setup |
Alharthi, Sharma, Sunka et al. [153] | Disaster Response | HHD + HMD | Collocated |
Carbone, Freschi, Mascioli et al. [154] | Telemedicine | HMD | Remote |
Elvezio, Ling, Liu et al. [155] | Rehabilitation | HMD | Variable |
Davis, Can, Pindrik et al. [152] | Remote surgery | HHD | Remote |
Gillis, Calyam, Apperson et al. [156] | Response Team | HMD | Remote |
Kurillo, Yang, Shia et al. [157] | Telemedicine | PC + HMD | Remote |
Nunes, Nedel and Roesler [158] | Exercise game | Others | Remote |
Nunes, Lucas, Simões-Marques et al. [159] | Disaster Response | HHD | Variable |
Popescu, Lăptoiu, Marinescu et al. [160] | Orthopaedic Surgery | HHD | Remote |
Shluzas, Aldaz and Leifer [161] | Telemedicine | HMD | Remote |
Sirilak and Muneesawang [162] | Telemedicine | HMD | Remote |
Vassell, Apperson, Calyam et al. [163] | Response Team | HMD | Remote |
Davis, Can, Pindrik et al. [152] developed an iPad-based tool, called Virtual Interactive Presence and Augmented Reality (VIPAR), which allows experienced surgeons to provide remote, real-time, virtual guidance to local surgeons. With the aid of VIPAR's composite image of the two video feeds, local and remote surgeons in Vietnam and the US were able to perform endoscopic third ventriculostomy with choroid plexus coagulation. Fifteen procedures were performed between Vietnam and the US with no significant complications. The authors performed subjective and objective evaluations of system performance through questionnaires, and the survey showed that both local and remote surgeons found VIPAR very useful for operating neurosurgeons.
Collaborative MR applications can support and enhance communication between medical specialists and remote patients. It will be promising to develop more platforms that allow a remote expert to guide or assist a local expert in performing surgeries, rehabilitation and recovery, and other medical procedures. E-consultation services could improve patient care, especially for patients in rural, remote, and under-serviced regions, allowing them to receive higher quality services and greater access to healthcare [86]. Similar to internet-based telemedicine, which provided a cost-efficient form of communication [157], collaborative MR applications may drive down medical costs. However, this will require studies to include more quantitative measures of performance, equipment alignment accuracy, and latency during collaboration; these must be taken into account because poor performance can lead to deaths. Compared with HHDs and traditional desktop configurations, HMDs have been shown to combine the real world and the virtual world and to allow in-situ interaction at the positions of 3D virtual models [164]. MR technology has also been shown to benefit depth perception, task completion, and social presence [165]. These strengths can be applied to improving medical education and healthcare services [162]. Through this, MR applications can assist medical practitioners in making more informed decisions, for instance when deciding whether or not to carry out surgery. This can be driven both by real medical data and by simulation and feedback through MR environments.
Tourism and heritage is the application area with the fewest collaborative MR papers, with only eight. In this application area, providing navigational aids overlaid on the real environment is essential to help users plan their movements using the spatial knowledge they have gained about the environment; if the navigation support is insufficient, users become disoriented and get lost. Developing efficient techniques for guiding users' attention towards virtual objects or points of interest in an environment is therefore the main focus of current research in this area. Of the eight papers, four focused on scene exploration, two on museum-based applications for cultural exploration, one on collaborative wayfinding, and one on land navigation. Five papers used a remote collaboration setup and three used a collocated setup. Three papers used HHDs, two used a PC with either HHDs or HMDs, two used SAR, and one used HHDs and HMDs (see Table 6).
References | Topic | Display Devices Used | Collaboration Setup |
Camps-Ortueta, Rodríguez-Muñoz, Gómez-Martín et al. [166] | Museum visit | HHD | Collocated |
Chen, Lee, Swift et al. [167] | Scene exploration | PC + HMD | Remote |
Gleason, Fiannaca, Kneisel et al. [168] | Scene exploration | HHD + HMD | Collocated |
Huang, Kaminski, Luo et al. [169] | Museum visit | HHD | Collocated |
Kallioniemi, Heimonen, Turunen et al. [170] | Scene exploration | SAR | Remote |
Li, Nittala, Sharlin et al. [171] | Land exploration | HHD | Remote |
Nuernberger, Lien, Grinta et al. [172] | Scene exploration | PC + HHD | Remote |
Kallioniemi, Hakulinen, Keskinen et al. [173] | Wayfinding | SAR | Remote |
Chen, Lee, Swift et al. [167] developed an interaction model for supporting live remote collaboration between users. They presented a cost-effective system in which remote users can use inexpensive devices to see the local users' view. In their paper, the authors illustrated how the system can be used during cave exploration: local users scan the physical space and create a 3D reconstructed model which remote users can annotate. The novelty of this system is that only a single HoloLens user is required to support collaboration in MR. In addition, they introduced a screen-lock mechanism which allows remote users to create accurate and stable 3D annotations even while local users move their heads around.
Just like entertainment and gaming applications, this application area will greatly benefit from using HHDs, as opposed to HMDs, since these devices are convenient to use while travelling. A novel direction in this application area is to provide a shared MR environment where users can create, share, and collect helpful information about physical objects and interesting locations, helping users thoroughly explore a new location or discover previously unknown features of familiar environments. By pointing their mobile phone cameras at a location, users can see reviews written by other people and ask for directions when they get lost. This can be helpful for tourists visiting a new country and for people visiting a museum or a heritage site. One major advantage of using MR technology in this application area is that positional information can be overlaid on the real environment to provide spatial awareness and guide users' attention. However, MR environments can potentially contain a huge number of virtual objects at different locations, a problem further complicated by the limited field of view of current MR devices, which makes it difficult to explore and navigate an MR environment. Although different techniques have already been developed, conveying information about surrounding virtual objects to the user remains a challenging task [174]. Recent research has shown the potential of multimodal feedback, such as audio, visual, and haptic cues, in enriching communication and assistance between remotely located users [144]. It is promising to explore the effectiveness of multimodal feedback in providing instructions and directions during collocated or remote collaboration in an MR environment.
Our study finds that there are three complementary factors that support and enhance collaboration in MR environments: (i) annotation techniques, which provide non-verbal communication cues to users; (ii) cooperative object manipulation techniques, which make complex 3D object manipulation easier by dividing tasks such as scaling, translation, and rotation between users; and (iii) user perception and cognition studies, which aim to lessen the cognitive workload of task understanding and completion and to increase users' perceptual awareness and presence. In this section, the reviewed papers are analysed based on these three complementary factors.
Traditional approaches to remote guidance through phone or video calls limit how a remote expert can provide instructions and convey spatial references to a local user. Using speech to describe spatial locations and actions can be ambiguous or vague, leading to confusion and error [175]. In contrast, MR environments enable a remote expert to overlay information for spatial referencing on a local user's environment and to allow a local user to view the remote expert's annotations directly overlaid on the environment. Remote collaborative MR setups leverage annotation techniques to improve communication between remote users. A total of 89 papers reported that they explored annotation techniques to support and enhance collaboration among users in MR environments. Interestingly, 28 out of the 89 papers (or 31%) reported using a combination of either desktop computers and HHDs or HMDs, 22 papers (or 25%) used just HMDs, 13 papers (or 15%) used just HHDs, 11 papers (or 12%) used a combination of the different display devices, 8 papers (or 9%) used SAR through projector systems, and 7 papers (or 8%) used a combination of HHDs and HMDs (see Table 7).
Sodhi, Jones, Forsyth et al. [176] presented a proof-of-concept design to explore 3D gestures and spatial inputs in collaborative MR. Remote users can perform a variety of virtual interactions which are displayed in the local user's environment. The authors used depth sensors to capture the 3D shape of objects in front of the sensor, as well as to track the location of the user's fingers. Qualitative user feedback from a preliminary study indicated that users could perform collaborative tasks easily using their system.
The survey showed that recent research explores alternative ways to improve communication between remotely located users. Although instructions can be given through voice or video calls, it is usually hard to convey complex instructions which require spatial context. Annotation is one of the most studied visual communication cues for presenting spatial information in MR environments [17]. In a collaborative MR environment, annotations can include individual gaze directions, pointers from collaborators, hand gestures, and even avatars that reflect collaborators' actions. Supporting annotations and non-verbal communication cues in collaborative MR environments is important because they require fewer inputs on the expert side and impose less cognitive load on the local worker side [151]. It is highly recommended to display annotations stabilised at the real-world location where they were drawn, to enhance mutual collaboration in MR environments [279]. Manual and automatic freezing of the shared video are the usual approaches to stabilising annotations in the real world. A manual freeze function allows a remote user to freeze the live video, draw on the still frame, and then return to the live feed [129,279]. With automatic freezing, the freeze and unfreeze actions are seamlessly integrated with the drawing interaction, so that the live video freezes when the remote user starts drawing and unfreezes when they stop [240,283]. Both techniques prevent annotations from being anchored in the wrong place when the local user unexpectedly changes viewpoint while the remote user is drawing.
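The automatic-freeze behaviour described above can be sketched as a small state machine. The class and method names below are hypothetical; a real system would operate on video frames and camera poses rather than integer frame ids.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class AutoFreezeFeed:
    """Minimal sketch of automatic freezing: while the remote user draws,
    the shared view stays pinned to the frame that was live when the
    stroke began, so the finished annotation is anchored to a stable
    frame even if the local user moves the camera mid-stroke."""
    live_frame: int = 0                         # latest frame from the local camera
    frozen_frame: Optional[int] = None          # frame shown while drawing, if any
    annotations: List[Tuple[int, str]] = field(default_factory=list)

    def receive_frame(self, frame_id: int) -> None:
        self.live_frame = frame_id              # live feed keeps advancing

    def begin_stroke(self) -> None:
        if self.frozen_frame is None:           # drawing started: freeze the view
            self.frozen_frame = self.live_frame

    def end_stroke(self, stroke: str) -> None:
        # Anchor the finished stroke to the frozen frame, then unfreeze.
        self.annotations.append((self.frozen_frame, stroke))
        self.frozen_frame = None

    @property
    def displayed_frame(self) -> int:
        return self.live_frame if self.frozen_frame is None else self.frozen_frame
```

For example, if the local camera advances from frame 10 to frame 25 while a stroke is in progress, the shared view stays on frame 10 and the finished annotation is anchored there; the manual variant simply exposes the freeze and unfreeze transitions as explicit user actions.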
A majority of the papers used mobile touchscreen displays to create annotations in 3D. Touchscreen displays allow direct interaction and provide instant haptic feedback. However, current mobile phones lack advanced sensors for 3D depth perception, so to support annotations in 3D, researchers must develop robust and efficient techniques to automatically infer depth for 2D drawings and create world-stabilised annotations in 3D. With the advent of new HMDs such as the Microsoft HoloLens, new types of annotations can be created.
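One common way to infer depth for a 2D drawing, assuming the system has some estimate of the scene geometry, is to cast a ray through the drawn pixel and intersect it with the estimated surface. The sketch below (all names are ours) uses a single plane as a stand-in for the reconstructed geometry a real system would query.

```python
import numpy as np

def unproject_to_plane(u, v, focal_px, cam_pos, cam_rot, plane_point, plane_normal):
    """Lift pixel offset (u, v) from the principal point into a 3D anchor
    by intersecting the camera ray with a plane approximating the scene."""
    d_cam = np.array([u / focal_px, v / focal_px, 1.0])  # pinhole ray, camera frame
    d_world = cam_rot @ d_cam                            # rotate into world frame
    d_world /= np.linalg.norm(d_world)
    denom = d_world @ plane_normal
    if abs(denom) < 1e-9:
        return None          # ray parallel to the surface: depth is ambiguous
    t = ((plane_point - cam_pos) @ plane_normal) / denom
    if t <= 0:
        return None          # intersection behind the camera
    return cam_pos + t * d_world   # world-stabilised 3D anchor for the drawing
```

For instance, a drawing made 100 pixels right of the principal point with a 500 px focal length, viewed by a camera at the origin facing a wall two metres away, anchors at roughly (0.4, 0, 2.0) in world coordinates and stays there as the camera moves.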
A total of 55 papers reported the implementation of cooperative object manipulation techniques. Of these, 19 papers (or 35%) used a combination of different types of display devices, 16 (or 29%) used HMDs, 13 (or 24%) used HHDs, 3 (or 5%) used HHDs and HMDs, 2 (or 4%) used a PC and SAR, and 2 (or 4%) used SAR (see Table 7). Again, a trend towards HMDs is prominent. A majority of the reviewed papers used collocated MR setups.
Interactions in MR environments may be very complex, depending on the degrees of freedom (DOFs) required for the task. 3D object manipulation comprises different subtasks, such as scaling, translation, and rotation. Recent research proposes that collaboration can be used to manage this complexity: each user chooses which transformations to perform on the object, and the effects of the individual transformations are combined to produce the final transformation. In user studies, collaborating users were found to perform better than individual users [18], strengthening the idea that collaborative MR experiences contribute positively to task completion.
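The division of manipulation subtasks described above can be illustrated by composing homogeneous transforms, one per collaborator. The helper names are ours, and the fixed scale-rotate-translate order is just one possible policy; a real system would merge live, concurrent edits.

```python
import numpy as np

def scale(s: float) -> np.ndarray:
    return np.diag([s, s, s, 1.0])

def rotation_z(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1.0]])

def translation(tx: float, ty: float, tz: float) -> np.ndarray:
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def combine(partial_transforms):
    """Compose each collaborator's partial manipulation (4x4 homogeneous
    matrices) into one final object transform, applied in list order."""
    final = np.eye(4)
    for m in partial_transforms:
        final = m @ final          # later contributions act on the result so far
    return final

# One user scales, another rotates, a third translates the shared object:
final = combine([scale(2.0), rotation_z(np.pi / 2), translation(1, 0, 0)])
```

Each user only issues the low-DOF transform they chose, yet the shared object ends up with the full composed manipulation, which is the essence of the cooperative approach the studies describe.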
The lack of interaction and object manipulation techniques during collaborative problem-solving leaves outstanding challenges that need to be addressed before collaborative MR becomes widely accepted by the community. Further work needs to be done to enable smooth, natural interaction during face-to-face and remote collaboration in shared workspaces. More natural interactions, such as new gesture- and gaze-based interactions, are novel directions in this area.
Cortés-Dávalos and Mendoza [187] developed a novel approach to the collaborative modelling of Digital Elevation Maps (DEMs), which are commonly used to model geometric assets (e.g. terrains in a landscape). In comparison to traditional applications for editing DEMs, their application allows a group of collaborators to use HHDs to easily visualise and modify 3D representations in an intuitive way. Their studies showed that the perceived workload was considerably lower because AR technology improved the users' sense of the structure, shape, and size of the DEMs.
Object manipulation greatly affects user experience in all application areas. It is important that object manipulation techniques are intuitive and seamless so that users can interact with virtual content effortlessly. Implementing manipulation techniques that are universal and work across MR platforms is an interesting research direction. It is also important to ensure that they do not fatigue users during prolonged engagement with the technology. Qualitative and quantitative evaluations must be done to assess their usability, and user attitudes towards interaction techniques must be taken into account when designing user interaction with collaborative MR systems.
During collaboration, collaborators make a joint effort to align and integrate their activities in a 'seamless' manner to achieve a common goal without interrupting each other [284]. In this process, it is key to be aware of what is going on in the shared workspace and to understand the collaborators' activities. With the different types of display devices, collaboration setups, and varying user interactions and experiences available, collaborative MR environments present different communication channels and different levels of awareness [285]. Perception and cognition studies are therefore interesting topics in collaborative MR research. Current research investigates how factors such as the type of display device and the collaboration setup affect users' feelings of enjoyment and togetherness in a collaborative MR environment. In addition, recent work has explored the level of awareness and understanding of participants collaborating in a shared workspace [17].
A total of 112 papers studied how MR enhances the sense of presence and the perception of social, situational, and task awareness during collaboration. There was also a considerable amount of research on how collaboration through MR environments reduces cognitive workload. 40 papers (or 36%) used HMDs, 18 (or 16%) used a combination of desktop computers and either HHDs or HMDs, 18 (or 16%) used HHDs, 17 (or 15%) used a combination of different display devices, 10 (or 9%) used SAR, and 9 (or 8%) used HHDs and HMDs (see Table 7). A significant number of perception and cognition studies were observed in remote collaborative MR setups.
Kim, Lee, Sakata et al. [279] conducted a user study on how the experience of sharing remote tasks and collaborating can be improved by adding visual communication cues to the environment. They developed a live remote collaboration system in which local users use either HHDs or HMDs and remote users use a desktop computer. To investigate the experience of sharing a remote task space and collaborating with someone, they compared three video-conferencing conditions with different combinations of communication cues: shared live video only, shared live video with a shared pointer, and shared live video combined with annotations. They found that adding visual cues, such as pointers and annotations, significantly improved the sense of presence, togetherness, and connectedness between users.
Choosing the types of display devices, collaboration setups, and user interaction and experience aspects for a collaborative MR environment is a crucial step, as different media provide different communication channels and different levels of awareness [285]. A number of papers have explored ways to improve the sense of presence and social awareness during remote collaboration. The analysis showed that remote users usually feel disconnected from local users for several reasons. Non-verbal communication cues, such as gaze [286,287] and pointing and hand gestures [150,151,250,266,277], are important for giving instructions but are usually limited during remote collaboration. In addition, the remote user's limited perspective of the local user's environment greatly reduces task awareness. Recent research provided an independent view for the remote user by letting them physically control the camera; the results showed that this technique leads to better awareness and understanding of the shared activity in a collaborative MR environment.
Crime scene investigation usually requires the involvement of different organisations. Police officers and firefighters are often the first to respond; they scout the area to make sure it is safe for crime scene investigators to enter. Once the scene is declared safe, it is searched for crucial traces and evidence, and during this time it is important not to contaminate the scene. This application area can benefit from collaborative MR systems.
Datcu, Lukosch and Lukosch [196] reported on the development and evaluation of a mobile AR system which supports collaboration among collocated and remote forensic investigators. Local investigators run the system on a smartphone strapped to their wrists, while remote investigators see the local users' view on a laptop. The authors evaluated the usability of their system and its effect on collaboration quality and situational awareness. They found that although the mobile AR system addressed the limitations of HMD-based AR systems, the attention divided between the smartphone and the real environment greatly impacted situational awareness.
Collaborative MR applications can improve collaboration between experts during crime scene investigation, and this application area will also greatly benefit from asynchronous collaboration. With advances in 3D depth sensors, it is promising to envision MR applications in which first responders at a crime scene scan the room to help ensure that evidence is free of tampering. The scanned crime scene can then be reviewed by multiple people in the head office, annotated during investigation, and archived for further investigation. As more evidence is added to the scene, new annotations can be created while previous annotations are loaded to provide a bigger picture of the crime scene. Digitally recorded and reconstructed crime scenes can also be investigated in completely new ways, for instance by measuring distances, annotating timelines to enhance storytelling outcomes, and reconstructing possible scenarios through simulation.
Effective decision-making based on data analysis often requires different experts. Collaborative MR applications can make data analysis more fluid by providing a shared environment in which experts have a sense of each other's presence. Unlike traditional computer setups, collaborative MR applications provide a means to visualise and analyse large amounts of data simultaneously. Furthermore, they can provide more natural user interactions than traditional computer setups.
Butscher, Hubenschmid, Müller et al. [268] presented Augmented Reality above the Tabletop (ART), a collaborative tool designed for the analysis of multidimensional, abstract data. It uses multiple scatter plots and links data points between them by creating 3D parallel coordinate plots (PCPs). ART is designed to work with HMDs and is anchored to a multitouch tabletop, providing users with familiar and fluid interactions. ART was found to allow more natural communication and coordination between collaborators; additionally, it can facilitate data immersion and foster a more fluid analysis process.
Collaborative MR applications seem to naturally support collaboration as they provide a shared environment where users can discuss and analyse information. An interesting research direction is to support non-linear analysis workflows in which users can save analysis states for sharing and consumption at different times. In addition, the integration of basic statistical operations with visual representations is a novel direction in this area. Users should be able to perform both manual and automatic operations, such as filtering, clustering, and dimensionality reduction, to better analyse and explore multidimensional data and generate more meaningful insights.
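As a toy illustration of coupling a manual operation (filtering) with an automatic one (dimensionality reduction) behind a shared view, the hypothetical helper below filters records and projects the survivors to 2D with PCA via SVD; a real system would feed the result to each collaborator's MR display.

```python
import numpy as np

def shared_projection(records: np.ndarray, min_first_value: float) -> np.ndarray:
    """Hypothetical analysis step for a shared MR workspace: a manual
    filtering operation followed by an automatic dimensionality-reduction
    step (PCA via SVD) so all collaborators see the same 2D layout."""
    kept = records[records[:, 0] >= min_first_value]    # manual filter on column 0
    centred = kept - kept.mean(axis=0)
    # Keep the two strongest principal components for the shared 2D view.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:2].T
```

Because both steps are deterministic functions of the shared data, every collaborator's device can recompute the same layout locally rather than streaming rendered views, which is one plausible way to keep such analyses consistent across users.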
In recent years, industries and research domains have grown rapidly in scale, complexity, and interdisciplinarity. Complex problems now require more knowledge than any single person possesses because the experience and expertise relevant to a problem are usually distributed among different professionals. Bringing different, and sometimes opposing, points of view together to create a shared understanding of the problem among stakeholders can lead to new insights, innovative ideas, and interesting artefacts. Although it is ideal for all professionals to be present during discussions, this is not always possible due to varying schedules. To address these problems, current research uses Mixed Reality (MR) devices to support and improve coordination, discussion, and collaboration between collocated and remote experts.
In this paper, 259 collaborative Mixed Reality (MR) papers published in a wide range of journals and conferences from 2013 to 2018 were reviewed to establish the current state of MR research. The reviewed papers were categorised by application area, display device, collaboration setup, and user interaction and user experience aspects. In this period, collaborative MR applications were primarily used in application areas such as Education and Training; Entertainment and Gaming; Industrial; and Architecture, Engineering, Construction, and Operations. Although there are relatively fewer papers in Medicine and in Tourism and Heritage, recent research has shown that collaborative MR applications provide valuable impact in these areas. HMDs were the most popular display devices used for collaborative MR. With recent advances in mobile technology, a shift in the preferred display device (e.g. towards HHDs) and increased research in this area can be expected. Remote collaboration was seen to benefit from collaborative MR applications; however, a collaborative MR system that supports both remote and collocated collaboration could be a very powerful tool for the discussion and analysis of complex problems and situations. Finally, our survey showed that attention has been given to synchronous collaboration setups that enhance the shared workspace. Collaboration, however, usually involves experts who may not always be present at the same time; asynchronous collaborative MR systems bridge this gap by enabling the creation and retention of information and its consumption at a later time. Among other things, our study finds that there are three complementary factors that support and enhance collaboration in MR environments:
(ⅰ) Annotation techniques were found to be useful for providing spatial information and conveying instructions, as they require fewer user inputs and impose less cognitive load.
(ⅱ) Cooperative object manipulation techniques in collaborative MR were observed to decrease completion times of various tasks as multiple users can manipulate the same object at the same time.
(ⅲ) Perception and cognition were also improved in collaborative MR applications. MR applications provide greater situational, social, and task awareness, making users more productive in both data finding and problem prediction during collaborative discussions.
This research is supported by an Australian Government Research Training Program (RTP) Scholarship and the Expanded Perception and Interaction Centre at UNSW Art & Design.
The authors declare that there is no conflict of interest.
[1] | Dey A, Billinghurst M, Lindeman RW, et al. (2018) A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014. Frontiers in Robotics and AI 5. |
[2] | Bai Z, Blackwell AF (2012) Analytic review of usability evaluation in ISMAR. Interact Comput 24: 450–460. doi: 10.1016/j.intcom.2012.07.004 |
[3] | Dünser A, Grasset R, Billinghurst M (2008) A survey of evaluation techniques used in augmented reality studies. Human Interface Technology Laboratory New Zealand. |
[4] | Swan JE, Gabbard JL (2005) Survey of user-based experimentation in augmented reality. In: Proceedings of 1st International Conference on Virtual Reality 22: 1–9. |
[5] | Milgram P, Kishino F (1994) A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems 77: 1321–1329. |
[6] | Azuma RT (1997) A survey of augmented reality. Presence: Teleoperators & Virtual Environments 6: 355–385. |
[7] | Milgram P, Takemura H, Utsumi A, et al. (1995) Augmented reality: A class of displays on the reality-virtuality continuum. Telemanipulator and Telepresence Technologies 2351: 282–293. International Society for Optics and Photonics. doi: 10.1117/12.197321 |
[8] | Irizarry J, Gheisari M, Williams G, et al. (2013) InfoSPOT: A mobile Augmented Reality method for accessing building information through a situation awareness approach. Automat Constr 33: 11–23. |
[9] | Ibáñez MB, Di Serio Á, Villarán D, et al. (2014) Experimenting with electromagnetism using augmented reality: Impact on flow student experience and educational effectiveness. Comput Educ 71: 1–13. doi: 10.1016/j.compedu.2013.09.004 |
[10] | Henderson S, Feiner S (2011) Exploring the benefits of augmented reality documentation for maintenance and repair. IEEE Transactions on Visualization and Computer Graphics 17: 1355–1368. doi: 10.1109/TVCG.2010.245 |
[11] | Dow S, Mehta M, Harmon E, et al. (2007) Presence and engagement in an interactive drama. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1475–1484, ACM. |
[12] | Billinghurst M, Kato H (1999) Collaborative mixed reality. In: Proceedings of the First International Symposium on Mixed Reality, pp. 261–284, Berlin: Springer Verlag. |
[13] | Wang X, Dunston PS (2006) Groupware concepts for augmented reality mediated human-to-human collaboration. In: Proceedings of the 23rd Joint International Conference on Computing and Decision Making in Civil and Building Engineering, pp. 1836–1842. |
[14] | Brockmann T, Krüger N, Stieglitz S, et al. (2013) A Framework for Collaborative Augmented Reality Applications. In 19th Americas Conference on Information Systems (AMCIS). |
[15] | Renevier P, Nigay L (2001) Mobile collaborative augmented reality: the augmented stroll. In: IFIP International Conference on Engineering for Human-Computer Interaction, pp. 299–316, Springer, Berlin, Heidelberg. |
[16] | Arias E, Eden H, Fischer G, et al. (2000) Transcending the individual human mind-creating shared understanding through collaborative design. ACM Transactions on Computer-Human Interaction 7: 84–113. doi: 10.1145/344949.345015 |
[17] | Kim S, Billinghurst M, Lee GA (2018) The Effect of Collaboration Styles and View Independence on Video-Mediated Remote Collaboration. Computer Supported Cooperative Work (CSCW) 27: 569–607. |
[18] | Cabral M, Roque G, Nagamura M, et al. (2016) Batmen-Hybrid collaborative object manipulation using mobile devices. In: 2016 IEEE Symposium on 3D User Interfaces (3DUI), pp. 275–276. |
[19] | Reilly D, Salimian M, MacKay B, et al. (2014) SecSpace: prototyping usable privacy and security for mixed reality collaborative environments. In: Proceedings of the 2014 ACM SIGCHI symposium on Engineering interactive computing systems, pp. 273–282. |
[20] | Lin T-H, Liu C-H, Tsai M-H, et al. (2014) Using augmented reality in a multiscreen environment for construction discussion. J Comput Civil Eng 29: 04014088. |
[21] | Hollenbeck JR, Ilgen DR, Sego DJ, et al. (1995) Multilevel theory of team decision making: Decision performance in teams incorporating distributed expertise. Journal of Applied Psychology 80: 292–316. doi: 10.1037/0021-9010.80.2.292 |
[22] | Lightle JP, Kagel JH, Arkes HR (2009) Information exchange in group decision making: The hidden profile problem reconsidered. Manage Sci 55: 568–581. doi: 10.1287/mnsc.1080.0975 |
[23] | Gül LF, Uzun C, Halıcı SM (2017) Studying Co-design. In: International Conference on Computer-Aided Architectural Design Futures, pp. 212–230. |
[24] | Al-Hammad A, Assaf S, Al-Shihah M (1997) The effect of faulty design on building maintenance. Journal of Quality in Maintenance Engineering 3: 29–39. doi: 10.1108/13552519710161526 |
[25] | Casarin J, Pacqueriaud N, Bechmann D (2018) UMI3D: A Unity3D Toolbox to Support CSCW Systems Properties in Generic 3D User Interfaces. Proceedings of the ACM on Human-Computer Interaction 2: 29. |
[26] | Coppens A, Mens T (2018) Towards Collaborative Immersive Environments for Parametric Modelling. In: International Conference on Cooperative Design, Visualization and Engineering, pp. 304–307, Springer. |
[27] | Cortés-Dávalos A, Mendoza S (2016) Layout planning for academic exhibits using Augmented Reality. In: 2016 13th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE), pp. 1–6, IEEE. |
[28] | Croft BL, Lucero C, Neurnberger D, et al. (2018) Command and Control Collaboration Sand Table (C2-CST). In: International Conference on Virtual, Augmented and Mixed Reality, pp. 249–259, Springer. |
[29] | Dong S, Behzadan AH, Chen F, et al. (2013) Collaborative visualization of engineering processes using tabletop augmented reality. Adv Eng Softw 55: 45–55. doi: 10.1016/j.advengsoft.2012.09.001 |
[30] | Elvezio C, Ling F, Liu J-S, et al. (2018) Collaborative exploration of urban data in virtual and augmented reality. In: ACM SIGGRAPH 2018 Virtual, Augmented, and Mixed Reality, p. 10, ACM. |
[31] | Etzold J, Grimm P, Schweitzer J, et al. (2014) kARbon: a collaborative MR web application for communication support in construction scenarios. In: Proceedings of the companion publication of the 17th ACM conference on Computer supported cooperative work & social computing, pp. 9–12, ACM. |
[32] | Flotyński J, Sobociński P (2018) Semantic 4-dimensional modeling of VR content in a heterogeneous collaborative environment. In: Proceedings of the 23rd International ACM Conference on 3D Web Technology, p. 11, ACM. |
[33] | Ibayashi H, Sugiura Y, Sakamoto D, et al. (2015) Dollhouse vr: a multi-view, multi-user collaborative design workspace with vr technology. SIGGRAPH Asia 2015 Emerging Technologies, p. 8, ACM. |
[34] | Leon M, Doolan DC, Laing R, et al. (2015) Development of a Computational Design Application for Interactive Surfaces. In: 2015 19th International Conference on Information Visualisation, pp. 506–511, IEEE. |
[35] | Li WK, Nee AYC, Ong SK (2018) Mobile augmented reality visualization and collaboration techniques for on-site finite element structural analysis. International Journal of Modeling, Simulation, and Scientific Computing 9: 1840001. doi: 10.1142/S1793962318400019 |
[36] | Nittala AS, Li N, Cartwright S, et al. (2015) PLANWELL: spatial user interface for collaborative petroleum well-planning. In: SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, p. 19, ACM. |
[37] | Phan T, Hönig W, Ayanian N (2018) Mixed Reality Collaboration Between Human-Agent Teams. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 659–660. |
[38] | Rajeb SB, Leclercq P (2013) Using spatial augmented reality in synchronous collaborative design. In: International Conference on Cooperative Design, Visualization and Engineering, pp. 1–10, Springer. |
[39] | Ro H, Kim I, Byun J, et al. (2018) PAMI: Projection Augmented Meeting Interface for Video Conferencing. In: 2018 ACM Multimedia Conference on Multimedia Conference, pp. 1274–1277, ACM. |
[40] | Schattel D, Tönnis M, Klinker G, et al. (2014) On-site augmented collaborative architecture visualization. In: 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 369–370. |
[41] | Shin JG, Ng G, Saakes D (2018) Couples Designing their Living Room Together: a Study with Collaborative Handheld Augmented Reality. In: Proceedings of the 9th Augmented Human International Conference, p. 3, ACM. |
[42] | Singh AR, Delhi VSK (2018) User behaviour in AR-BIM-based site layout planning. International Journal of Product Lifecycle Management 11: 221–244. doi: 10.1504/IJPLM.2018.094715 |
[43] | Trout TT, Russell S, Harrison A, et al. (2018) Collaborative mixed reality (MxR) and networked decision making. In: Next-Generation Analyst VI 10653: 106530N. International Society for Optics and Photonics. |
[44] | Alhumaidan H, Lo KPY, Selby A (2017) Co-designing with children a collaborative augmented reality book based on a primary school textbook. International Journal of Child-Computer Interaction 15: 24–36. |
[45] | Alhumaidan H, Lo KPY, Selby A (2015) Co-design of augmented reality book for collaborative learning experience in primary education. In: 2015 SAI Intelligent Systems Conference (IntelliSys), pp. 427–430, IEEE. |
[46] | Benavides X, Amores J, Maes P (2015) Invisibilia: revealing invisible data using augmented reality and internet connected devices. In: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, pp. 341–344, ACM. |
[47] | Blanco-Fernández Y, López-Nores M, Pazos-Arias JJ, et al. (2014) REENACT: A step forward in immersive learning about Human History by augmented reality, role playing and social networking. Expert Syst Appl 41: 4811–4828. doi: 10.1016/j.eswa.2014.02.018 |
[48] | Boyce MW, Rowan CP, Baity DL, et al. (2017) Using Assessment to Provide Application in Human Factors Engineering to USMA Cadets. In: International Conference on Augmented Cognition, pp. 411–422, Springer. |
[49] | Bressler DM, Bodzin AM (2013) A mixed methods assessment of students' flow experiences during a mobile augmented reality science game. Journal of Computer Assisted Learning 29: 505–517. doi: 10.1111/jcal.12008 |
[50] | Chen M, Fan C, Wu D (2016) Designing Effective Materials and Activities for Mobile Augmented Learning. In: International Conference on Blended Learning, pp. 85–93, Springer. |
[51] | Daiber F, Kosmalla F, Krüger A (2013) BouldAR: using augmented reality to support collaborative boulder training. In: CHI' 13 Extended Abstracts on Human Factors in Computing Systems, pp. 949–954, ACM. |
[52] | Desai K, Belmonte UHH, Jin R, et al. (2017) Experiences with Multi-Modal Collaborative Virtual Laboratory (MMCVL). In: 2017 IEEE Third International Conference on Multimedia Big Data (BigMM), pp. 376–383, IEEE. |
[53] | Fleck S, Simon G (2013) An augmented reality environment for astronomy learning in elementary grades: An exploratory study. In: Proceedings of the 25th Conference on l'Interaction Homme-Machine, p. 14, ACM. |
[54] | Gazcón N, Castro S (2015) ARBS: An Interactive and Collaborative System for Augmented Reality Books. In: International Conference on Augmented and Virtual Reality, pp. 89–108, Springer. |
[55] | Gelsomini F, Kanev K, Hung P, et al. (2017) BYOD Collaborative Kanji Learning in Tangible Augmented Reality Settings. In: International Conference on Global Research and Education, pp. 315–325, Springer. |
[56] | Gironacci IM, Mc-Call R, Tamisier T (2017) Collaborative Storytelling Using Gamification and Augmented Reality. In: International Conference on Cooperative Design, Visualization and Engineering, pp. 90–93, Springer. |
[57] | Goyal S, Vijay RS, Monga C, et al. (2016) Code Bits: An Inexpensive Tangible Computational Thinking Toolkit For K-12 Curriculum. In: Proceedings of the TEI'16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 441–447, ACM. |
[58] | Greenwald SW (2015) Responsive Facilitation of Experiential Learning Through Access to Attentional State. In: Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, pp. 1–4, ACM. |
[59] | Han J, Jo M, Hyun E, et al. (2015) Examining young children's perception toward augmented reality-infused dramatic play. Educational Technology Research and Development 63: 455–474. doi: 10.1007/s11423-015-9374-9 |
[60] | Iftene A, Trandabăț D (2018) Enhancing the Attractiveness of Learning through Augmented Reality. Procedia Computer Science 126: 166–175. doi: 10.1016/j.procs.2018.07.220 |
[61] | Jyun-Fong G, Ju-Ling S (2013) The Instructional Application of Augmented Reality in Local History Pervasive Game. pp. 387. |
[62] | Kang S, Norooz L, Oguamanam V, et al. (2016) SharedPhys: Live Physiological Sensing, Whole-Body Interaction, and Large-Screen Visualizations to Support Shared Inquiry Experiences. In: Proceedings of the The 15th International Conference on Interaction Design and Children, pp. 275–287, ACM. |
[63] | Kazanidis I, Palaigeorgiou G, Papadopoulou Α, et al. (2018) Augmented Interactive Video: Enhancing Video Interactivity for the School Classroom. Journal of Engineering Science and Technology Review 11. |
[64] | Keifert D, Lee C, Dahn M, et al. (2017) Agency, Embodiment, & Affect During Play in a Mixed-Reality Learning Environment. In: Proceedings of the 2017 Conference on Interaction Design and Children, pp. 268–277, ACM. |
[65] | Kim H-J, Kim B-H (2018) Implementation of young children English education system by AR type based on P2P network service model. Peer-to-Peer Networking and Applications 11: 1252–1264. doi: 10.1007/s12083-017-0612-2 |
[66] | Krstulovic R, Boticki I, Ogata H (2017) Analyzing heterogeneous learning logs using the iterative convergence method. In: 2017 IEEE 6th International Conference on Teaching, Assessment, and Learning for Engineering, pp. 482–485. |
[67] | Le TN, Le YT, Tran MT (2014) Applying Saliency-Based Region of Interest Detection in Developing a Collaborative Active Learning System with Augmented Reality. In: International Conference on Virtual, Augmented and Mixed Reality, pp. 51–62, Springer. |
[68] | MacIntyre B, Zhang D, Jones R, et al. (2016) Using projection ar to add design studio pedagogy to a cs classroom. In: 2016 IEEE Virtual Reality (VR), pp. 227–228. |
[69] | Malinverni L, Valero C, Schaper MM, et al. (2018) A conceptual framework to compare two paradigms of augmented and mixed reality experiences. In: Proceedings of the 17th ACM Conference on Interaction Design and Children, pp. 7–18, ACM. |
[70] | Maskott GK, Maskott MB, Vrysis L (2015) Serious+: A technology assisted learning space based on gaming. In: 2015 International Conference on Interactive Mobile Communication Technologies and Learning (IMCL), pp. 430–432, IEEE. |
[71] | Pareto L (2012) Mathematical literacy for everyone using arithmetic games. In: Proceedings of the 9th International Conference on Disability, Virtual Reality and Associated Technologies 9: 87–96. Reading, UK: University of Reading. |
[72] | Peters E, Heijligers B, de Kievith J, et al. (2016) Design for collaboration in mixed reality: Technical challenges and solutions. In: 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-GAMES), pp. 1–7, IEEE. |
[73] | Punjabi DM, Tung LP, Lin BSP (2013) CrowdSMILE: a crowdsourcing-based social and mobile integrated system for learning by exploration. In: 2013 IEEE 10th International Conference on Ubiquitous Intelligence and Computing and 2013 IEEE 10th International Conference on Autonomic and Trusted Computing, pp. 521–526. |
[74] | Rodríguez-Vizzuett L, Pérez-Medina JL, Muñoz-Arteaga J, et al. (2015) Towards the Definition of a Framework for the Management of Interactive Collaborative Learning Applications for Preschoolers. In: Proceedings of the XVI International Conference on Human Computer Interaction, p. 11, ACM. |
[75] | Sanabria JC, Arámburo-Lizárraga J (2017) Enhancing 21st Century Skills with AR: Using the Gradual Immersion Method to develop Collaborative Creativity. Eurasia Journal of Mathematics, Science and Technology Education 13: 487–501. |
[76] | Shaer O, Valdes C, Liu S, et al. (2014) Designing reality-based interfaces for experiential bio-design. Pers Ubiquit Comput 18: 1515–1532. doi: 10.1007/s00779-013-0752-1 |
[77] | Shirazi A, Behzadan AH (2015) Content Delivery Using Augmented Reality to Enhance Students' Performance in a Building Design and Assembly Project. Advances in Engineering Education 4. |
[78] | Shirazi A, Behzadan AH (2013) Technology-enhanced learning in construction education using mobile context-aware augmented reality visual simulation. In: 2013 Winter Simulations Conference (WSC), pp. 3074–3085, IEEE. |
[79] | Sun H, Liu Y, Zhang Z, et al. (2018) Employing Different Viewpoints for Remote Guidance in a Collaborative Augmented Environment. In: Proceedings of the Sixth International Symposium of Chinese CHI, pp. 64–70, ACM. |
[80] | Sun H, Zhang Z, Liu Y, et al. (2016) OptoBridge: assisting skill acquisition in the remote experimental collaboration. In: Proceedings of the 28th Australian Conference on Computer-Human Interaction, pp. 195–199, ACM. |
[81] | Thompson B, Leavy L, Lambeth A, et al. (2016) Participatory Design of STEM Education AR Experiences for Heterogeneous Student Groups: Exploring Dimensions of Tangibility, Simulation, and Interaction. In: 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pp. 53–58. |
[82] | Wiehr F, Kosmalla F, Daiber F, et al. (2016) betaCube: Enhancing Training for Climbing by a Self-Calibrating Camera-Projection Unit. In: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 1998–2004, ACM. |
[83] | Yangguang L, Yue L, Xiaodong W (2014) Multiplayer collaborative training system based on Mobile AR innovative interaction technology. In: 2014 International Conference on Virtual Reality and Visualization, pp. 81–85, IEEE. |
[84] | Yoon SA, Wang J, Elinich K (2014) Augmented reality and learning in science museums. Digital Systems for Open Access to Formal and Informal Learning, pp. 293–305, Springer. |
[85] | Zubir F, Suryani I, Ghazali N (2018) Integration of Augmented Reality into College Yearbook. In: MATEC Web of Conferences 150: 05031. EDP Sciences. doi: 10.1051/matecconf/201815005031 |
[86] | Dascalu MI, Moldoveanu A, Shudayfat EA (2014) Mixed reality to support new learning paradigms. In: 2014 8th International Conference on System Theory, Control and Computing (ICSTCC), pp. 692–697, IEEE. |
[87] | Boonbrahm P, Kaewrat C, Boonbrahm S (2016) Interactive Augmented Reality: A New Approach for Collaborative Learning. In: International Conference on Learning and Collaboration Technologies, pp. 115–124, Springer. |
[88] | LaViola Jr JJ, Kruijff E, McMahan RP, et al. (2017) 3D user interfaces: theory and practice. Addison-Wesley Professional. |
[89] | Kim S, Lee GA, Sakata N (2013) Comparing pointing and drawing for remote collaboration. In: 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 1–6, IEEE. |
[90] | Akahoshi S, Matsushita M (2018) Magical Projector: Virtual Object Sharing Method among Multiple Users in a Mixed Reality Space. In: 2018 Nicograph International (NicoInt), pp. 70–73, IEEE. |
[91] | Baillard C, Fradet M, Alleaume V, et al. (2017) Multi-device mixed reality TV: a collaborative experience with joint use of a tablet and a headset. In: Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology, p. 67, ACM. |
[92] | Baldauf M, Fröhlich P (2013) The augmented video wall: multi-user AR interaction with public displays. In: CHI'13 Extended Abstracts on Human Factors in Computing Systems, pp. 3015–3018, ACM. |
[93] | Ballagas R, Dugan TE, Revelle G, et al. (2013) Electric agents: fostering sibling joint media engagement through interactive television and augmented reality. In: Proceedings of the 2013 conference on Computer supported cooperative work, pp. 225–236, ACM. |
[94] | Beimler R, Bruder G, Steinicke F (2013) Smurvebox: A smart multi-user real-time virtual environment for generating character animations. In: Proceedings of the Virtual Reality International Conference: Laval Virtual, p. 1, ACM. |
[95] | Bollam P, Gothwal E, Tejaswi V G, et al. (2015) Mobile collaborative augmented reality with real-time AR/VR switching. In: ACM SIGGRAPH 2015 Posters, p. 25, ACM. |
[96] | Bourdin P, Sanahuja JMT, Moya CC, et al. (2013) Persuading people in a remote destination to sing by beaming there. In: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology, pp. 123–132, ACM. |
[97] | Brondi R, Avveduto G, Alem L, et al. (2015) Evaluating the effects of competition vs collaboration on user engagement in an immersive game using natural interaction. In: Proceedings of the 21st ACM Symposium on Virtual Reality Software and Technology, p. 191, ACM. |
[98] | Ch'ng E, Harrison D, Moore S (2017) Shift-life interactive art: Mixed-reality artificial ecosystem simulation. Presence: Teleoperators & Virtual Environments 26: 157–181. |
[99] | Courchesne L, Durand E, Roy B (2014) Posture platform and the drawing room: virtual teleportation in cyberspace. Leonardo 47: 367–374. doi: 10.1162/LEON_a_00842 |
[100] | Dal Corso A, Olsen M, Steenstrup KH, et al. (2015) VirtualTable: a projection augmented reality game. In: SIGGRAPH Asia 2015 Posters, p. 40, ACM. |
[101] | Datcu D, Lukosch S, Lukosch H (2016) A Collaborative Game to Study Presence and Situational Awareness in a Physical and an Augmented Reality Environment. J Univers Comput Sci 22: 247–270. |
[102] | Datcu D, Lukosch SG, Lukosch HK (2014) A collaborative game to study the perception of presence during virtual co-location. In: Proceedings of the companion publication of the 17th ACM conference on Computer supported cooperative work & social computing, pp. 5–8, ACM. |
[103] | Figueroa P, Hernández JT, Merienne F, et al. (2018) Heterogeneous, distributed mixed reality Applications. A concept. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 549–550. |
[104] | Fischbach M, Lugrin J-L, Brandt M, et al. (2018) Follow the White Robot-A Role-Playing Game with a Robot Game Master. In: Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems, pp. 1812–1814. |
[105] | Fischbach M, Striepe H, Latoschik ME, et al. (2016) A low-cost, variable, interactive surface for mixed-reality tabletop games. In: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, pp. 297–298, ACM. |
[106] | Günther S, Müller F, Schmitz M, et al. (2018) CheckMate: Exploring a Tangible Augmented Reality Interface for Remote Interaction. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, p. LBW570, ACM. |
[107] | Huo K, Wang T, Paredes L, et al. (2018) SynchronizAR: Instant Synchronization for Spontaneous and Spatial Collaborations in Augmented Reality. In: Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, pp. 19–30, ACM. |
[108] | Karakottas A, Papachristou A, Doumanoglou A, et al. (2018) Augmented VR. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 18–22, IEEE. |
[109] | Lantin M, Overstall SL, Zhao H (2018) I am afraid: voice as sonic sculpture. In: ACM SIGGRAPH 2018 Posters, pp. 1–2, ACM. |
[110] | Loviska M, Krause O, Engelbrecht HA, et al. (2016) Immersed gaming in Minecraft. In: Proceedings of the 7th International Conference on Multimedia Systems, p. 32, ACM. |
[111] | Mackamul EB, Esteves A (2018) A Look at the Effects of Handheld and Projected Augmented-reality on a Collaborative Task. In: Proceedings of the Symposium on Spatial User Interaction, pp. 74–78, ACM. |
[112] | Margolis T, Cornish T (2013) Vroom: designing an augmented environment for remote collaboration in digital cinema production. In: The Engineering Reality of Virtual Reality 2013 8649: 86490F. International Society for Optics and Photonics. doi: 10.1117/12.2008587 |
[113] | McGill M, Williamson JH, Brewster SA (2016) Examining the role of smart TVs and VR HMDs in synchronous at-a-distance media consumption. ACM T Comput-Hum Int 23: 33. |
[114] | Mechtley B, Stein J, Roberts C, et al. (2017) Rich State Transitions in a Media Choreography Framework Using an Idealized Model of Cloud Dynamics. In: Proceedings of the Thematic Workshops of ACM Multimedia 2017, pp. 477–484, ACM. |
[115] | Pillias C, Robert-Bouchard R, Levieux G (2014) Designing tangible video games: lessons learned from the sifteo cubes. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 3163–3166, ACM. |
[116] | Podkosova I, Kaufmann H (2018) Co-presence and proxemics in shared walkable virtual environments with mixed collocation. In: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, p. 21, ACM. |
[117] | Prins MJ, Gunkel SN, Stokking HM, et al. (2018) TogetherVR: A framework for photorealistic shared media experiences in 360-degree VR. SMPTE Motion Imag J 127: 39–44. |
[118] | Rostami A, Bexell E, Stanisic S (2018) The Shared Individual. In: Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 511–516, ACM. |
[119] | Sato T, Hwang DH, Koike H (2018) MlioLight: Projector-camera Based Multi-layered Image Overlay System for Multiple Flashlights Interaction. In: Proceedings of the 2018 ACM International Conference on Interactive Surfaces and Spaces, pp. 263–271, ACM. |
[120] | Spielmann S, Schuster A, Götz K, et al. (2016) VPET: a toolset for collaborative virtual filmmaking. In: SIGGRAPH ASIA 2016 Technical Briefs, p. 29, ACM. |
[121] | Trottnow J, Götz K, Seibert S, et al. (2015) Intuitive virtual production tools for set and light editing. In: Proceedings of the 12th European Conference on Visual Media Production, p. 6, ACM. |
[122] | Valverde I, Cochrane T (2017) Senses Places: soma-tech mixed-reality participatory performance installation/environment. In: Proceedings of the 8th International Conference on Digital Arts, pp. 195–197, ACM. |
[123] | Van Troyer A (2013) Enhancing site-specific theatre experience with remote partners in sleep no more. In: Proceedings of the 2013 ACM International workshop on Immersive media experiences, pp. 17–20, ACM. |
[124] | Vermeer J, Alaka S, de Bruin N, et al. (2018) League of lasers: a superhuman sport using motion tracking. In: Proceedings of the First Superhuman Sports Design Challenge on First International Symposium on Amplifying Capabilities and Competing in Mixed Realities, p. 8, ACM. |
[125] | Wegner K, Seele S, Buhler H, et al. (2017) Comparison of Two Inventory Design Concepts in a Collaborative Virtual Reality Serious Game. In: Extended Abstracts Publication of the Annual Symposium on Computer-Human Interaction in Play, pp. 323–329, ACM. |
[126] | Zhou Q, Hagemann G, Fels S, et al. (2018) Coglobe: a co-located multi-person FTVR experience. In: ACM SIGGRAPH 2018 Emerging Technologies, p. 5, ACM. |
[127] | Zimmerer C, Fischbach M, Latoschik ME (2014) Fusion of Mixed-Reality Tabletop and Location-Based Applications for Pervasive Games. In: Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces, pp. 427–430, ACM. |
[128] | Speicher M, Hall BD, Yu A, et al. (2018) XD-AR: Challenges and Opportunities in Cross-Device Augmented Reality Application Development. Proceedings of the ACM on Human-Computer Interaction 2: 7. |
[129] | Gauglitz S, Nuernberger B, Turk M, et al. (2014) World-stabilized annotations and virtual scene navigation for remote collaboration. In: Proceedings of the 27th Annual ACM symposium on User interface software and technology, pp. 449–459, ACM. |
[130] | Abramovici M, Wolf M, Adwernat S, et al. (2017) Context-aware Maintenance Support for Augmented Reality Assistance and Synchronous Multi-user Collaboration. Procedia CIRP 59: 18–22. doi: 10.1016/j.procir.2016.09.042 |
[131] | Aschenbrenner D, Li M, Dukalski R, et al. (2018) Collaborative Production Line Planning with Augmented Fabrication. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 509–510, IEEE. |
[132] | Bednarz T, James C, Widzyk-Capehart E, et al. (2015) Distributed collaborative immersive virtual reality framework for the mining industry. Machine Vision and Mechatronics in Practice, pp. 39–48, Springer. |
[133] | Capodieci A, Mainetti L, Alem L (2015) An innovative approach to digital engineering services delivery: An application in maintenance. In: 2015 11th International Conference on Innovations in Information Technology (IIT), pp. 342–349, IEEE. |
[134] | Choi SH, Kim M, Lee JY (2018) Situation-dependent remote AR collaborations: Image-based collaboration using a 3D perspective map and live video-based collaboration with a synchronized VR mode. Comput Ind 101: 51–66. doi: 10.1016/j.compind.2018.06.006 |
[135] | Clergeaud D, Roo JS, Hachet M, et al. (2017) Towards seamless interaction between physical and virtual locations for asymmetric collaboration. In: Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology, pp. 1–4, ACM. |
[136] | Datcu D, Cidota M, Lukosch SG, et al. (2014) Virtual co-location to support remote assistance for inflight maintenance in ground training for space missions. In: Proceedings of the 15th International Conference on Computer Systems and Technologies, pp. 134–141, ACM. |
[137] | Domova V, Vartiainen E, Englund M (2014) Designing a remote video collaboration system for industrial settings. In: Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces, pp. 229–238, ACM. |
[138] | Elvezio C, Sukan M, Oda O, et al. (2017) Remote collaboration in AR and VR using virtual replicas. In: ACM SIGGRAPH 2017 VR Village, p. 13, ACM. |
[139] | Funk M, Kritzler M, Michahelles F (2017) HoloCollab: A Shared Virtual Platform for Physical Assembly Training using Spatially-Aware Head-Mounted Displays. In: Proceedings of the Seventh International Conference on the Internet of Things, p. 19, ACM. |
[140] | Galambos P, Csapó ÁB, Zentay PZ, et al. (2015) Design, programming and orchestration of heterogeneous manufacturing systems through VR-powered remote collaboration. Robotics and Computer-Integrated Manufacturing 33: 68–77. doi: 10.1016/j.rcim.2014.08.012 |
[141] | Galambos P, Baranyi PZ, Rudas IJ (2014) Merged physical and virtual reality in collaborative virtual workspaces: The VirCA approach. In: IECON 2014 – 40th Annual Conference of the IEEE Industrial Electronics Society, pp. 2585–2590, IEEE. |
[142] | Gupta RK, Ucler C, Bernard A (2018) Extension of the Virtual Customer Inspection for Distant Collaboration in NPD. In: 2018 IEEE International Conference on Engineering, Technology and Innovation, pp. 1–7. |
[143] | Gurevich P, Lanir J, Cohen B (2015) Design and implementation of TeleAdvisor: a projection-based augmented reality system for remote collaboration. Computer Supported Cooperative Work (CSCW) 24: 527–562. doi: 10.1007/s10606-015-9232-7 |
[144] | Günther S, Kratz SG, Avrahami D, et al. (2018) Exploring Audio, Visual, and Tactile Cues for Synchronous Remote Assistance. In: Proceedings of the 11th Pervasive Technologies Related to Assistive Environments Conference, pp. 339–344, ACM. |
[145] | Morosi F, Carli I, Caruso G, et al. (2018) Analysis of Co-Design Scenarios and Activities for the Development of A Spatial-Augmented Reality Design Platform. In: DS 92: Proceedings of the DESIGN 2018 15th International Design Conference, pp. 381–392. |
[146] | Plopski A, Fuvattanasilp V, Poldi J, et al. (2018) Efficient In-Situ Creation of Augmented Reality Tutorials. In: 2018 Workshop on Metrology for Industry 4.0 and IoT, pp. 7–11, IEEE. |
[147] | Seo D-W, Lee S-M, Park K-S, et al. (2015) Integrated engineering product design simulation platform for collaborative simulation under the user experience of SME users. simulation 1: 2. |
[148] | Zenati N, Hamidia M, Bellarbi A, et al. (2015) E-maintenance for photovoltaic power system in Algeria. In: 2015 IEEE International Conference on Industrial Technology, pp. 2594–2599. |
[149] | Zenati N, Benbelkacem S, Belhocine M, et al. (2013) A new AR interaction for collaborative E-maintenance system. IFAC Proceedings Volumes 46: 619–624. |
[150] | Zenati-Henda N, Bellarbi A, Benbelkacem S, et al. (2014) Augmented reality system based on hand gestures for remote maintenance. In: 2014 International Conference on Multimedia Computing and Systems (ICMCS), pp. 5–8, IEEE. |
[151] | Huang W, Billinghurst M, Alem L, et al. (2018) HandsInTouch: sharing gestures in remote collaboration. In: Proceedings of the 30th Australian Conference on Computer-Human Interaction, pp. 396–400, ACM. |
[152] | Davis MC, Can DD, Pindrik J, et al. (2016) Virtual interactive presence in global surgical education: international collaboration through augmented reality. World Neurosurgery 86: 103–111. doi: 10.1016/j.wneu.2015.08.053 |
[153] | Alharthi SA, Sharma HN, Sunka S, et al. (2018) Designing Future Disaster Response Team Wearables from a Grounding in Practice. In: Proceedings of the Technology, Mind, and Society, p. 1, ACM. |
[154] | Carbone M, Freschi C, Mascioli S, et al. (2016) A wearable augmented reality platform for telemedicine. In: International Conference on Augmented Reality, Virtual Reality and Computer Graphics, pp. 92–100, Springer. |
[155] | Elvezio C, Ling F, Liu J-S, et al. (2018) Collaborative Virtual Reality for Low-Latency Interaction. In: The 31st Annual ACM Symposium on User Interface Software and Technology Adjunct Proceedings, pp. 179–181, ACM. |
[156] | Gillis J, Calyam P, Apperson O, et al. (2016) Panacea's Cloud: Augmented reality for mass casualty disaster incident triage and co-ordination. In: 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), pp. 264–265, IEEE. |
[157] | Kurillo G, Yang AY, Shia V, et al. (2016) New emergency medicine paradigm via augmented telemedicine. In: 8th International Conference on Virtual, Augmented and Mixed Reality, VAMR 2016 and Held as Part of 18th International Conference on Human-Computer Interaction, HCI International 2016, pp. 502–511, Springer. |
[158] | Nunes M, Nedel LP, Roesler V (2013) Motivating people to perform better in exergames: Collaboration vs. competition in virtual environments. In: 2013 IEEE Virtual Reality (VR), pp. 115–116, IEEE. |
[159] | Nunes IL, Lucas R, Simões-Marques M, et al. (2017) Augmented Reality in Support of Disaster Response. In: International Conference on Applied Human Factors and Ergonomics, pp. 155–167, Springer. |
[160] | Popescu D, Lăptoiu D, Marinescu R, et al. (2017) Advanced Engineering in Orthopedic Surgery Applications. Key Engineering Materials 752: 99–104. doi: 10.4028/www.scientific.net/KEM.752.99 |
[161] | Shluzas LA, Aldaz G, Leifer L (2016) Design Thinking Health: Telepresence for Remote Teams with Mobile Augmented Reality. In: Design Thinking Research, pp. 53–66, Springer. |
[162] | Sirilak S, Muneesawang P (2018) A New Procedure for Advancing Telemedicine Using the HoloLens. IEEE Access 6: 60224–60233. doi: 10.1109/ACCESS.2018.2875558 |
[163] | Vassell M, Apperson O, Calyam P, et al. (2016) Intelligent Dashboard for augmented reality based incident command response co-ordination. In: 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), pp. 976–979, IEEE. |
[164] | Bach B, Sicat R, Beyer J, et al. (2018) The Hologram in My Hand: How Effective is Interactive Exploration of 3D Visualizations in Immersive Tangible Augmented Reality? IEEE Transactions on Visualization and Computer Graphics 24: 457–467. |
[165] | Daher S (2017) Optical see-through vs. spatial augmented reality simulators for medical applications. In: 2017 IEEE Virtual Reality (VR), pp. 417–418. |
[166] | Camps-Ortueta I, Rodríguez-Muñoz JM, Gómez-Martín PP, et al. (2017) Combining augmented reality with real maps to promote social interaction in treasure hunts. CoSECivi, pp. 131–143. |
[167] | Chen H, Lee AS, Swift M, et al. (2015) 3D collaboration method over HoloLens™ and Skype™ end points. In: Proceedings of the 3rd International Workshop on Immersive Media Experiences, pp. 27–30, ACM. |
[168] | Gleason C, Fiannaca AJ, Kneisel M, et al. (2018) FootNotes: Geo-referenced Audio Annotations for Nonvisual Exploration. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2: 109. |
[169] | Huang W, Kaminski B, Luo J, et al. (2015) SMART: design and evaluation of a collaborative museum visiting application. In: Cooperative Design, Visualization, and Engineering, 12th International Conference, CDVE 2015, 9320: 57–64. |
[170] | Kallioniemi P, Heimonen T, Turunen M, et al. (2015) Collaborative navigation in virtual worlds: how gender and game experience influence user behavior. In: Proceedings of the 21st ACM Symposium on Virtual Reality Software and Technology, pp. 173–182, ACM. |
[171] | Li N, Nittala AS, Sharlin E, et al. (2014) Shvil: collaborative augmented reality land navigation. In: CHI'14 Extended Abstracts on Human Factors in Computing Systems, pp. 1291–1296, ACM. |
[172] | Nuernberger B, Lien K-C, Grinta L, et al. (2016) Multi-view gesture annotations in image-based 3D reconstructed scenes. In: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, pp. 129–138, ACM. |
[173] | Kallioniemi P, Hakulinen J, Keskinen T, et al. (2013) Evaluating landmark attraction model in collaborative wayfinding in virtual learning environments. In: Proceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia, pp. 1–10, ACM. |
[174] | Bork F, Schnelzer C, Eck U, et al. (2018) Towards Efficient Visual Guidance in Limited Field-of-View Head-Mounted Displays. IEEE Transactions on Visualization and Computer Graphics 24: 2983–2992. doi: 10.1109/TVCG.2018.2868584 |
[175] | Sodhi RS, Jones BR, Forsyth D, et al. (2013) BeThere: 3D mobile collaboration with spatial input. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 179–188, ACM. |
[176] | Lien K-C, Nuernberger B, Turk M, et al. (2015) [POSTER] 2D-3D Co-segmentation for AR-based Remote Collaboration. In: 2015 IEEE International Symposium on Mixed and Augmented Reality, pp. 184–185, IEEE. |
[177] | Nuernberger B, Lien K-C, Höllerer T, et al. (2016) Anchoring 2D gesture annotations in augmented reality. In: 2016 IEEE Virtual Reality (VR), pp. 247–248, IEEE. |
[178] | Nuernberger B, Lien K-C, Höllerer T, et al. (2016) Interpreting 2d gesture annotations in 3d augmented reality. In: 2016 IEEE Symposium on 3D User Interfaces (3DUI), pp. 149–158. |
[179] | Kovachev D, Nicolaescu P, Klamma R (2014) Mobile real-time collaboration for semantic multimedia. Mobile Networks and Applications 19: 635–648. doi: 10.1007/s11036-013-0453-z |
[180] | You S, Thompson CK (2017) Mobile collaborative mixed reality for supporting scientific inquiry and visualization of earth science data. In: 2017 IEEE Virtual Reality (VR), pp. 241–242. |
[181] | Wiehr F, Daiber F, Kosmalla F, et al. (2017) ARTopos: augmented reality terrain map visualization for collaborative route planning. In: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, pp. 1047–1050, ACM. |
[182] | Müller J, Rädle R, Reiterer H (2017) Remote Collaboration With Mixed Reality Displays: How Shared Virtual Landmarks Facilitate Spatial Referencing. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 6481–6486, ACM. |
[183] | Park S, Kim J (2018) Augmented Memory: Site-Specific Social Media with AR. In: Proceedings of the 9th Augmented Human International Conference, p. 41, ACM. |
[184] | Ryskeldiev B, Igarashi T, Zhang J, et al. (2018) Spotility: Crowdsourced Telepresence for Social and Collaborative Experiences in Mobile Mixed Reality. In: Companion of the 2018 ACM Conference on Computer Supported Cooperative Work and Social Computing, pp. 373–376, ACM. |
[185] | Grandi JG, Berndt I, Debarba HG, et al. (2017) Collaborative manipulation of 3D virtual objects in augmented reality scenarios using mobile devices. In: 2017 IEEE Symposium on 3D User Interfaces (3DUI), pp. 264–265, IEEE. |
[186] | Cortés-Dávalos A, Mendoza S (2016) AR-based Modeling of 3D Objects in Multi-user Mobile Environments. In: CYTED-RITOS International Workshop on Groupware, pp. 21–36, Springer. |
[187] | Cortés-Dávalos A, Mendoza S (2016) Augmented Reality-Based Groupware for Editing 3D Surfaces on Mobile Devices. In: 2016 International Conference on Collaboration Technologies and Systems (CTS), pp. 319–326, IEEE. |
[188] | Zhang W, Han B, Hui P, et al. (2018) CARS: Collaborative Augmented Reality for Socialization. In: Proceedings of the 19th International Workshop on Mobile computing Systems & Applications, pp. 25–30, ACM. |
[189] | Cortés-Dávalos A, Mendoza S (2016) Collaborative Web Authoring of 3D Surfaces Using Augmented Reality on Mobile Devices. In: 2016 IEEE/WIC/ACM International Conference on Web Intelligence (WI), pp. 640–643, IEEE. |
[190] | Pani M, Poiesi F (2018) Distributed Data Exchange with Leap Motion. International Conference on Augmented Reality, Virtual Reality, and Computer Graphics, pp. 655–667, Springer. |
[191] | Grandi JG, Debarba HG, Berndt I, et al. (2018) Design and Assessment of a Collaborative 3D Interaction Technique for Handheld Augmented Reality. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 49–56. |
[192] | Müller J, Rädle R, Reiterer H (2016) Virtual Objects as Spatial Cues in Collaborative Mixed Reality Environments: How They Shape Communication Behavior and User Task Load. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1245–1249, ACM. |
[193] | Müller J, Butscher S, Feyer SP, et al. (2017) Studying collaborative object positioning in distributed augmented realities. In: Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia, pp. 123–132, ACM. |
[194] | Francese R, Passero I, Zarraonandia T (2012) An augmented reality application to gather participant feedback during a meeting. In: Information systems: crossroads for organization, management, accounting and engineering, pp. 173–180. |
[195] | Datcu D, Lukosch SG, Lukosch HK (2016) Handheld Augmented Reality for Distributed Collaborative Crime Scene Investigation. In: Proceedings of the 19th International Conference on Supporting Group Work, pp. 267–276, ACM. |
[196] | Pece F, Steptoe W, Wanner F, et al. (2013) Panoinserts: mobile spatial teleconferencing. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1319–1328, ACM. |
[197] | Cai M, Masuko S, Tanaka J (2018) Gesture-based Mobile Communication System Providing Side-by-side Shopping Feeling. In: Proceedings of the 23rd International Conference on Intelligent User Interfaces Companion, p. 2, ACM. |
[198] | Chang YS, Nuernberger B, Luan B, et al. (2017) Gesture-based augmented reality annotation. In: 2017 IEEE Virtual Reality (VR), pp. 469–470, IEEE. |
[199] | Le Chénéchal M, Duval T, Gouranton V, et al. (2016) Vishnu: virtual immersive support for helping users: an interaction paradigm for collaborative remote guiding in mixed reality. In: 2016 IEEE Third VR International Workshop on Collaborative virtual Environments (3DCVE), pp. 9–12. |
[200] | Piumsomboon T, Lee Y, Lee GA, et al. (2017) Empathic Mixed Reality: Sharing What You Feel and Interacting with What You See. In: 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR), pp. 38–41, IEEE. |
[201] | Piumsomboon T, Lee Y, Lee G, et al. (2017) CoVAR: a collaborative virtual and augmented reality system for remote collaboration. In: SIGGRAPH Asia 2017 Emerging Technologies, p. 3, ACM. |
[202] | Lee Y, Masai K, Kunze KS, et al. (2016) A Remote Collaboration System with Empathy Glasses. In: 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pp. 342–343, IEEE. |
[203] | Piumsomboon T, Dey A, Ens B, et al. (2017) [POSTER] CoVAR: Mixed-Platform Remote Collaborative Augmented and Virtual Realities System with Shared Collaboration Cues. In: 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pp. 218–219, IEEE. |
[204] | Piumsomboon T, Dey A, Ens B, et al. (2017) Exploring enhancements for remote mixed reality collaboration. In: SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, p. 16, ACM. |
[205] | Amores J, Benavides X, Maes P (2015) Showme: A remote collaboration system that supports immersive gestural communication. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, pp. 1343–1348, ACM. |
[206] | Yu J, Noh S, Jang Y, et al. (2015) A hand-based collaboration framework in egocentric coexistence reality. In: 2015 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), pp. 545–548, IEEE. |
[207] | Piumsomboon T, Lee GA, Hart JD, et al. (2018) Mini-Me: An Adaptive Avatar for Mixed Reality Remote Collaboration. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, p. 46, ACM. |
[208] | Piumsomboon T, Lee GA, Billinghurst M (2018) Snow Dome: A Multi-Scale Interaction in Mixed Reality Remote Collaboration. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, p. D115, ACM. |
[209] | Cidota M, Lukosch S, Datcu D, et al. (2016) Workspace awareness in collaborative AR using HMDS: a user study comparing audio and visual notifications. In: Proceedings of the 7th Augmented Human International Conference 2016, p. 3, ACM. |
[210] | Jo D, Kim K-H, Kim GJ (2016) Effects of avatar and background representation forms to co-presence in mixed reality (MR) tele-conference systems. In: SIGGRAPH Asia 2016 Virtual Reality meets Physical Reality: Modelling and Simulating Virtual Humans and Environments, p. 12, ACM. |
[211] | Yu J, Jeon J-u, Park G, et al. (2016) A Unified Framework for Remote Collaboration Using Interactive AR Authoring and Hands Tracking. In: International Conference on Distributed, Ambient, and Pervasive Interactions, pp. 132–141, Springer. |
[212] | Nassani A, Lee G, Billinghurst M, et al. (2017) [POSTER] The Social AR Continuum: Concept and User Study. In: 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pp. 7–8. |
[213] | Gao L, Bai H, Lee G, et al. (2016) An oriented point-cloud view for MR remote collaboration. SIGGRAPH ASIA 2016 Mobile Graphics and Interactive Applications, p. 8, ACM. |
[214] | Lee GA, Teo T, Kim S, et al. (2017) Mixed reality collaboration through sharing a live panorama. SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, p. 14, ACM. |
[215] | Gao L, Bai H, Lindeman R, et al. (2017) Static local environment capturing and sharing for MR remote collaboration. SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, p. 17, ACM. |
[216] | Lee GA, Teo T, Kim S, et al. (2017) Sharedsphere: MR collaboration through shared live panorama. SIGGRAPH Asia 2017 Emerging Technologies, pp. 1–2, ACM. |
[217] | Rühmann LM, Prilla M, Brown G (2018) Cooperative Mixed Reality: An Analysis Tool. In: Proceedings of the 2018 ACM Conference on Supporting Groupwork, pp. 107–111, ACM. |
[218] | Lee H, Ha T, Noh S, et al. (2013) Context-of-Interest Driven Trans-Space Convergence for Spatial Co-presence. In: Proceedings of the First International Conference on Distributed, Ambient, and Pervasive Interactions 8028: 388–395. doi: 10.1007/978-3-642-39351-8_42 |
[219] | Yang P, Kitahara I, Ohta Y (2015) [POSTER] Remote Mixed Reality System Supporting Interactions with Virtualized Objects. In: 2015 IEEE International Symposium on Mixed and Augmented Reality, pp. 64–67, IEEE. |
[220] | Benbelkacem S, Zenati-Henda N, Belghit H, et al. (2015) Extended web services for remote collaborative manipulation in distributed augmented reality. In: 2015 3rd International Conference on Control, Engineering & Information Technology (CEIT), pp. 1–5, IEEE. |
[221] | Pan Y, Sinclair D, Mitchell K (2018) Empowerment and embodiment for collaborative mixed reality systems. Comput Animat Virt W 29: e1838. doi: 10.1002/cav.1838 |
[222] | Drochtert D, Geiger C (2015) Collaborative magic lens graph exploration. In: SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, p. 25, ACM. |
[223] | Lee J-Y, Kwon J-H, Nam S-H, et al. (2016) Coexistent Space: Collaborative Interaction in Shared 3D Space. In: Proceedings of the 2016 Symposium on Spatial User Interaction, p. 175, ACM. |
[224] | Müller F, Günther S, Nejad AH, et al. (2017) Cloudbits: supporting conversations through augmented zero-query search visualization. In: Proceedings of the 5th Symposium on Spatial User Interaction, pp. 30–38, ACM. |
[225] | Lehment NH, Tiefenbacher P, Rigoll G (2014) Don't Walk into Walls: Creating and Visualizing Consensus Realities for Next Generation Videoconferencing. In: Proceedings, Part I, of the 6th International Conference on Virtual, Augmented and Mixed Reality. Designing and Developing Virtual and Augmented Environments 8525: 170–180. |
[226] | Roth D, Lugrin J-L, Galakhov D, et al. (2016) Avatar realism and social interaction quality in virtual reality. In: 2016 IEEE Virtual Reality (VR), pp. 277–278, IEEE. |
[227] | Kasahara S, Nagai S, Rekimoto J (2017) JackIn Head: Immersive visual telepresence system with omnidirectional wearable camera. IEEE Transactions on Visualization and Computer Graphics 23: 1222–1234. doi: 10.1109/TVCG.2016.2642947 |
[228] | Luongo C, Leoncini P (2018) An UE4 Plugin to Develop CVE Applications Leveraging Participant's Full Body Tracking Data. International Conference on Augmented Reality, Virtual Reality, and Computer Graphics, pp. 610–622. |
[229] | Piumsomboon T, Lee GA, Ens B, et al. (2018) Superman vs Giant: A Study on Spatial Perception for a Multi-Scale Mixed Reality Flying Telepresence Interface. IEEE Transactions on Visualization and Computer Graphics 24: 2974–2982. doi: 10.1109/TVCG.2018.2868594 |
[230] | Kasahara S, Rekimoto J (2015) JackIn head: immersive visual telepresence system with omnidirectional wearable camera for remote collaboration. In: Proceedings of the 21st ACM Symposium on Virtual Reality Software and Technology, pp. 217–225, ACM. |
[231] | Adams H, Thompson C, Thomas D, et al. (2015) The effect of interpersonal familiarity on cooperation in a virtual environment. In: Proceedings of the ACM SIGGRAPH Symposium on Applied Perception, p. 138, ACM. |
[232] | Ryskeldiev B, Cohen M, Herder J (2017) Applying rotational tracking and photospherical imagery to immersive mobile telepresence and live video streaming groupware. In: SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, p. 5. |
[233] | Mai C, Bartsch SA, Rieger L (2018) Evaluating Shared Surfaces for Co-Located Mixed-Presence Collaboration. In: Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, pp. 1–5, ACM. |
[234] | Congdon BJ, Wang T, Steed A (2018) Merging environments for shared spaces in mixed reality. In: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, p. 11. |
[235] | Gao L, Bai H, He W, et al. (2018) Real-time visual representations for mobile mixed reality remote collaboration. SIGGRAPH Asia 2018 Virtual & Augmented Reality, p. 15. |
[236] | Lee G, Kim S, Lee Y, et al. (2017) [POSTER] Mutually Shared Gaze in Augmented Video Conference. In: 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pp. 79–80, IEEE. |
[237] | Tiefenbacher P, Gehrlich T, Rigoll G (2015) Impact of annotation dimensionality under variable task complexity in remote guidance. In: 2015 IEEE Symposium on 3D User Interfaces (3DUI), pp. 189–190, IEEE. |
[238] | Adcock M, Gunn C (2015) Using Projected Light for Mobile Remote Guidance. Computer Supported Cooperative Work (CSCW) 24: 591–611. doi: 10.1007/s10606-015-9237-2 |
[239] | Kim S, Lee GA, Ha S, et al. (2015) Automatically freezing live video for annotation during remote collaboration. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, pp. 1669–1674, ACM. |
[240] | Tait M, Billinghurst M (2015) The effect of view independence in a collaborative AR system. Computer Supported Cooperative Work (CSCW) 24: 563–589. doi: 10.1007/s10606-015-9231-8 |
[241] | Adcock M, Anderson S, Thomas B (2013) RemoteFusion: real time depth camera fusion for remote collaboration on physical tasks. In: Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, pp. 235–242, ACM. |
[242] | Kim S, Lee GA, Sakata N, et al. (2013) Study of augmented gesture communication cues and view sharing in remote collaboration. In: 2013 IEEE International Symposium on Mixed and Augmented Reality, pp. 261–262, IEEE. |
[243] | Sakata N, Takano Y, Nishida S (2014) Remote Collaboration with Spatial AR Support. In: International Conference on Human-Computer Interaction, pp. 148–157, Springer. |
[244] | Tiefenbacher P, Gehrlich T, Rigoll G, et al. (2014) Supporting remote guidance through 3D annotations. In: Proceedings of the 2nd ACM Symposium on Spatial User Interaction, p. 141, ACM. |
[245] | Tait M, Billinghurst M (2014) View independence in remote collaboration using AR. ISMAR, pp. 309–310. |
[246] | Gauglitz S, Nuernberger B, Turk M, et al. (2014) In touch with the remote world: Remote collaboration with augmented reality drawings and virtual navigation. In: Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology, pp. 197–205, ACM. |
[247] | Lukosch S, Lukosch H, Datcu D, et al. (2015) Providing information on the spot: Using augmented reality for situational awareness in the security domain. Computer Supported Cooperative Work (CSCW) 24: 613–664. doi: 10.1007/s10606-015-9235-4 |
[248] | Lukosch SG, Lukosch HK, Datcu D, et al. (2015) On the spot information in augmented reality for teams in the security domain. In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, pp. 983–988, ACM. |
[249] | Yamada S, Chandrasiri NP (2018) Evaluation of Hand Gesture Annotation in Remote Collaboration Using Augmented Reality. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 727–728. |
[250] | Anton D, Kurillo G, Bajcsy R (2018) User experience and interaction performance in 2D/3D telecollaboration. Future Gener Comp Sy 82: 77–88. doi: 10.1016/j.future.2017.12.055 |
[251] | Tait M, Tsai T, Sakata N, et al. (2013) A projected augmented reality system for remote collaboration. In: 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 1–6, IEEE. |
[252] | Irlitti A, Itzstein GSV, Smith RT, et al. (2014) Performance improvement using data tags for handheld spatial augmented reality. In: Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology, pp. 161–165, ACM. |
[253] | Iwai D, Matsukage R, Aoyama S, et al. (2018) Geometrically Consistent Projection-Based Tabletop Sharing for Remote Collaboration. IEEE Access 6: 6293–6302. doi: 10.1109/ACCESS.2017.2781699 |
[254] | Pejsa T, Kantor J, Benko H, et al. (2016) Room2room: Enabling life-size telepresence in a projected augmented reality environment. In: Proceedings of the 19th ACM Conference on Conference on Computer-Supported Cooperative Work & Social Computing, pp. 1716–1725, ACM. |
[255] | Schwede C, Hermann T (2015) HoloR: Interactive mixed-reality rooms. In: 2015 6th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), pp. 517–522, IEEE. |
[256] | Salimian MH, Reilly DF, Brooks S, et al. (2016) Physical-Digital Privacy Interfaces for Mixed Reality Collaboration: An Exploratory Study. In: Proceedings of the 2016 ACM International Conference on Interactive Surfaces and Spaces, pp. 261–270, ACM. |
[257] | Weiley V, Adcock M (2013) Drawing in the lamposcope. In: Proceedings of the 9th ACM Conference on Creativity & Cognition, pp. 382–383, ACM. |
[258] | Irlitti A, Itzstein GSV, Alem L, et al. (2013) Tangible interaction techniques to support asynchronous collaboration. In: 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 1–6, IEEE. |
[259] | Kratky A (2015) Transparent touch–interacting with a multi-layered touch-sensitive display system. In: International Conference on Universal Access in Human-Computer Interaction, pp. 114–126, Springer. |
[260] | Moniri MM, Valcarcel FAE, Merkel D, et al. (2016) Hybrid team interaction in the mixed reality continuum. In: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, pp. 335–336, ACM. |
[261] | Seo D, Yoo B, Ko H (2018) Webizing collaborative interaction space for cross reality with various human interface devices. In: Proceedings of the 23rd International ACM Conference on 3D Web Technology, pp. 1–8, ACM. |
[262] | Randhawa JS (2016) Stickie: Mobile Device Supported Spatial Collaborations. In: Proceedings of the 2016 Symposium on Spatial User Interaction, pp. 163–163, ACM. |
[263] | Tabrizian P, Petrasova A, Harmon B, et al. (2016) Immersive tangible geospatial modeling. In: Proceedings of the 24th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, p. 88, ACM. |
[264] | Ren D, Lee B, Höllerer T (2018) XRCreator: interactive construction of immersive data-driven stories. In: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, p. 136, ACM. |
[265] | Minagawa J, Choi W, Li L, et al. (2016) Development of collaborative workspace system using hand gesture. In: 2016 IEEE 5th Global Conference on Consumer Electronics, pp. 1–2, IEEE. |
[266] | Tanaya M, Yang K, Christensen T, et al. (2017) A Framework for analyzing AR/VR Collaborations: An initial result. In: 2017 IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), pp. 111–116, IEEE. |
[267] | Butscher S, Hubenschmid S, Müller J, et al. (2018) Clusters, Trends, and Outliers: How Immersive Technologies Can Facilitate the Collaborative Analysis of Multidimensional Data. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, p. 90, ACM. |
[268] | Machuca MDB, Chinthammit W, Yang Y, et al. (2014) 3D mobile interactions for public displays. In: SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, p. 17, ACM. |
[269] | Ríos AP, Callaghan V, Gardner M, et al. (2014) Interactions within Distributed Mixed Reality Collaborative Environments. In: IE'14 Proceedings of the 2014 International Conference on Intelligent Environments, pp. 382–383. |
[270] | Ueda Y, Iwazaki K, Shibasaki M, et al. (2014) HaptoMIRAGE: mid-air autostereoscopic display for seamless interaction with mixed reality environments. In: ACM SIGGRAPH 2014 Emerging Technologies, p. 10, ACM. |
[271] | Wang X, Love PED, Kim MJ, et al. (2014) Mutual awareness in collaborative design: An Augmented Reality integrated telepresence system. Computers in Industry 65: 314–324. doi: 10.1016/j.compind.2013.11.012 |
[272] | Komiyama R, Miyaki T, Rekimoto J (2017) JackIn space: designing a seamless transition between first and third person view for effective telepresence collaborations. In: Proceedings of the 8th Augmented Human International Conference, p. 14, ACM. |
[273] | Oyekoya O, Stone R, Steptoe W, et al. (2013) Supporting interoperability and presence awareness in collaborative mixed reality environments. In: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology, pp. 165–174, ACM. |
[274] | Reilly DF, Echenique A, Wu A, et al. (2015) Mapping out Work in a Mixed Reality Project Room. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 887–896, ACM. |
[275] | Dean J, Apperley M, Rogers B (2014) Refining personal and social presence in virtual meetings. In: Proceedings of the Fifteenth Australasian User Interface Conference, 150: 67–75, Australian Computer Society, Inc. |
[276] | Robert K, Zhu D, Huang W, et al. (2013) MobileHelper: remote guiding using smart mobile devices, hand gestures and augmented reality. In: SIGGRAPH Asia 2013 Symposium on Mobile Graphics and Interactive Applications, p. 39, ACM. |
[277] | Billinghurst M, Nassani A, Reichherzer C (2014) Social panoramas: using wearable computers to share experiences. In: SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, p. 25, ACM. |
[278] | Kim S, Lee G, Sakata N, et al. (2014) Improving co-presence with augmented visual communication cues for sharing experience through video conference. In: 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 83–92, IEEE. |
[279] | Cha Y, Nam S, Yi MY, et al. (2018) Augmented Collaboration in Shared Space Design with Shared Attention and Manipulation. In: The 31st Annual ACM Symposium on User Interface Software and Technology Adjunct Proceedings, pp. 13–15, ACM. |
[280] | Grandi JG (2017) Design of collaborative 3D user interfaces for virtual and augmented reality. In: 2017 IEEE Virtual Reality (VR), pp. 419–420, IEEE. |
[281] | Koskela T, Mazouzi M, Alavesa P, et al. (2018) AVATAREX: Telexistence System based on Virtual Avatars. In: Proceedings of the 9th Augmented Human International Conference, p. 13, ACM. |
[282] | Heiser J, Tversky B, Silverman M (2004) Sketches for and from collaboration. Visual and spatial reasoning in design III 3: 69–78. |
[283] | Fakourfar O, Ta K, Tang R, et al. (2016) Stabilized annotations for mobile remote assistance. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1548–1560, ACM. |
[284] | Schmidt K (2002) The problem with 'awareness': Introductory remarks on 'awareness in CSCW'. Computer Supported Cooperative Work (CSCW) 11: 285–298. doi: 10.1023/A:1021272909573 |
[285] | Olson GM, Olson JS (2000) Distance matters. Human–Computer Interaction 15: 139–178. doi: 10.1207/S15327051HCI1523_4 |
[286] | Ishii H, Kobayashi M, Arita K (1994) Iterative design of seamless collaboration media. Communications of the ACM 37: 83–97. |
[287] | Ishii H, Kobayashi M, Grudin J (1993) Integration of interpersonal space and shared workspace: ClearBoard design and experiments. ACM Transactions on Information Systems 11: 349–375. doi: 10.1145/159764.159762 |
References | Topic | Display Devices Used | Collaboration Setup |
Casarin, Pacqueriaud and Bechmann [25] | Construction | PC + HMD | Collocated
Coppens and Mens [26] | Architectural modelling | HMD | Variable
Cortés-Dávalos and Mendoza [27] | Layout Planning | HHD | Collocated
Croft, Lucero, Neurnberger et al. [28] | Military Operations | HMD | Collocated
Dong, Behzadan, Chen et al. [29] | Visualisation | HMD | Collocated
Elvezio, Ling, Liu et al. [30] | Urban Data Exploration | HMD | Collocated
Etzold, Grimm, Schweitzer et al. [31] | Construction | PC + HHD | Remote
Flotyński and Sobociński [32] | Urban Design | Combination | Collocated
Gül, Uzun and Halıcı [23] | Design Planning | Combination | Variable
Ibayashi, Sugiura, Sakamoto et al. [33] | Architecture Design | Others | Collocated
Leon, Doolan, Laing et al. [34] | Computational Design | Touchscreen Display | Collocated
Li, Nee and Ong [35] | FE Structural Analysis | HHD | Collocated
Lin, Liu, Tsai et al. [20] | Construction Discussion | HHD + Public Display | Collocated
Nittala, Li, Cartwright et al. [36] | Field Operations | HHD + HMD | Remote
Phan, Hönig and Ayanian [37] | Operations | HMD | Remote
Rajeb and Leclercq [38] | Architectural Design | SAR | Variable
Ro, Kim, Byun et al. [39] | Architectural Design | SAR | Remote
Schattel, Tönnis, Klinker et al. [40] | Architectural Design | HHD | Collocated
Shin, Ng and Saakes [41] | Interior Design | HHD | Collocated
Singh and Delhi [42] | Layout Planning | HHD | Collocated
Trout, Russell, Harrison et al. [43] | Military Operations | PC + HMD | Collocated
References | Topic | Display Devices Used | Collaboration Setup |
Alhumaidan, Lo and Selby [44] | Learning | HHD | Collocated
Alhumaidan, Lo and Selby [45] | Learning | HHD | Collocated
Benavides, Amores and Maes [46] | Experiential learning | HMD | Remote
Blanco-Fernández, López-Nores, Pazos-Arias et al. [47] | Immersive learning, human history | HHD | Collocated
Boyce, Rowan, Baity et al. [48] | Military training | SAR | Collocated
Bressler and Bodzin [49] | Learning, science forensic game | HHD | Collocated
Chen, Fan and Wu [50] | Learning, horticultural science | HHD | Collocated
Daiber, Kosmalla and Krüger [51] | Boulder training | HHD | Collocated
Desai, Belmonte, Jin et al. [52] | Training, chemistry experiments | PC | Remote
Fleck and Simon [53] | Learning, astronomy | SAR | Collocated
Gazcón and Castro [54] | Learning | PC | Variable
Gelsomini, Kanev, Hung et al. [55] | Learning, Kanji language | HHD | Collocated
Gironacci, McCall and Tamisier [56] | Storytelling, gamification | HHD + HMD | Collocated
Goyal, Vijay, Monga et al. [57] | Learning, programming | HHD | Collocated
Greenwald [58] | Situated Learning | HHD + HMD | Remote
Han, Jo, Hyun et al. [59] | Learning, dramatic play | PC | Collocated
Iftene and Trandabăț [60] | Learning | HHD | Collocated
Jyun-Fong and Ju-Ling [61] | Learning, local history | HHD | Collocated
Kang, Norooz, Oguamanam et al. [62] | Embodied interaction | SAR | Collocated
Kazanidis, Palaigeorgiou, Papadopoulou et al. [63] | Learning, interactive videos | HHD + SAR | Collocated
Keifert, Lee, Dahn et al. [64] | Children behaviour during collaborative activities | SAR | Collocated
Kim and Kim [65] | Learning, English education | HHD | Collocated
Krstulovic, Boticki and Ogata [66] | Learning | HHD | Collocated
Le, Le and Tran [67] | Learning | HHD + HMD | Collocated
MacIntyre, Zhang, Jones et al. [68] | Learning, programming | SAR | Collocated
Malinverni, Valero, Schaper et al. [69] | Embodied Learning | HHD | Collocated
Maskott, Maskott and Vrysis [70] | Learning, gamification | Combination | Collocated
Pareto [71] | Learning, arithmetic games | Combination | Collocated
Peters, Heijligers, de Kievith et al. [72] | Leadership training | HMD | Collocated
Punjabi, Tung and Lin [73] | Learning by exploration | PC + HHD | Remote
Rodríguez-Vizzuett, Pérez-Medina, Muñoz-Arteaga et al. [74] | Learning | Others | Collocated
Sanabria and Arámburo-Lizárraga [75] | Learning | Combination | Collocated
Shaer, Valdes, Liu et al. [76] | Experiential learning | Others | Collocated
Shirazi and Behzadan [77] | Education, Construction | HHD | Collocated
Shirazi and Behzadan [78] | Education, Construction | HHD | Collocated
Sun, Liu, Zhang et al. [79] | Teaching | PC + HMD | Remote
Sun, Zhang, Liu et al. [80] | Teaching | PC + HMD | Remote
Thompson, Leavy, Lambeth et al. [81] | Education | HHD | Collocated
Wiehr, Kosmalla, Daiber et al. [82] | Training, climbing | SAR | Collocated
Yangguang, Yue and Xiaodong [83] | Training | HHD | Collocated
Yoon, Wang and Elinich [84] | Learning | PC + SAR | Collocated
Zubir, Suryani and Ghazali [85] | Learning | HHD | Collocated
References | Topic | Display Devices Used | Collaboration Setup |
Akahoshi and Matsushita [92] | Game | Others | Collocated |
Baillard, Fradet, Alleaume et al. [93] | Media consumption | HHD + HMD | Collocated
Baldauf and Fröhlich [94] | Media consumption | HHD + Public Display | Collocated
Ballagas, Dugan, Revelle et al. [90] | Media consumption | HHD | Collocated
Beimler, Bruder and Steinicke [95] | Animation application | PC + HMD + SAR | Collocated
Bollam, Gothwal, Tejaswi V et al. [96] | Chess board game | HMD | Collocated
Boonbrahm, Kaewrat and Boonbrahm [87] | 3D puzzle game | HHD | Remote
Bourdin, Sanahuja, Moya et al. [97] | Entertainment, singing | HMD + CAVE | Remote
Brondi, Avveduto, Alem et al. [98] | 3D jigsaw puzzle game | HMD | Remote
Ch'ng, Harrison and Moore [99] | Interactive art | SAR | Collocated
Courchesne, Durand and Roy [100] | Interactive art | Others | Remote
Dal Corso, Olsen, Steenstrup et al. [101] | Game | SAR | Collocated
Datcu, Lukosch and Lukosch [102] | Game | PC + HMD | Remote
Datcu, Lukosch and Lukosch [103] | 3D block game | PC + HMD | Remote
Figueroa, Hernández, Merienne et al. [104] | Game | PC + HMD | Variable
Fischbach, Lugrin, Brandt et al. [105] | Board game | Tabletop | Collocated
Fischbach, Striepe, Latoschik et al. [106] | Board game | SAR | Collocated
Günther, Müller, Schmitz et al. [107] | Chess board game | HHD + HMD | Collocated
Huo, Wang, Paredes et al. [108] | Coin collection game | HHD | Collocated
Karakottas, Papachristou, Doumanoglou et al. [109] | Immersive game | HHD + HMD | Remote
Lantin, Overstall and Zhao [110] | Media art | HMD | Collocated
Loviska, Krause, Engelbrecht et al. [111] | Game | HMD | Collocated
Mackamul and Esteves [112] | Game, match pairs | HHD + SAR | Collocated
Margolis and Cornish [113] | Cinema production | Combination | Remote
McGill, Williamson and Brewster [114] | Media consumption | HMD | Remote
Mechtley, Stein, Roberts et al. [115] | Media arts | SAR | Collocated
Pillias, Robert-Bouchard and Levieux [116] | Tangible video game | Others | Collocated
Podkosova and Kaufmann [117] | Game | HMD | Variable
Prins, Gunkel, Stokking et al. [118] | Media consumption | PC + HMD | Remote
Reilly, Salimian, MacKay et al. [19] | Game, privacy and security | Tabletop + Public Display | Variable
Rostami, Bexell and Stanisic [119] | Immersive performance | HMD | Remote
Sato, Hwang and Koike [120] | Game | SAR | Collocated
Spielmann, Schuster, Götz et al. [121] | Film making | HHD + HMD | Collocated
Trottnow, Götz, Seibert et al. [122] | Cinema production | PC + HHD + HMD | Collocated
Valverde and Cochrane [123] | Performing arts | Others | Variable
Van Troyer [124] | Theatre performance | Others | Remote
Vermeer, Alaka, de Bruin et al. [125] | Game, lasers | HHD | Collocated
Wegner, Seele, Buhler et al. [126] | Game | HMD | Collocated
Zhou, Hagemann, Fels et al. [127] | 3D game and mental puzzle | Others | Collocated
Zimmerer, Fischbach and Latoschik [128] | Tabletop game | HHD + Tabletop | Collocated
References | Topic | Display Devices Used | Collaboration Setup |
Abramovici, Wolf, Adwernat et al. [130] | Maintenance | HHD | Collocated
Aschenbrenner, Li, Dukalski et al. [131] | Production Line Planning | HMD | Variable
Bednarz, James, Widzyk-Capehart et al. [132] | Mining Industry | Combination | Remote
Capodieci, Mainetti and Alem [133] | Maintenance | HMD + Multitouch | Remote
Choi, Kim and Lee [134] | Industry | HHD | Remote
Clergeaud, Roo, Hachet et al. [135] | Industry | HMD + Spatial | Remote
Datcu, Cidota, Lukosch et al. [136] | Inflight Maintenance | Combination | Remote
Domova, Vartiainen and Englund [137] | Industry | PC + HHD | Remote
Elvezio, Sukan, Oda et al. [138] | Assembly, maintenance | HMD | Remote
Funk, Kritzler and Michahelles [139] | Assembly | HMD | Collocated
Galambos, Csapó, Zentay et al. [140] | Manufacturing | Combination | Remote
Galambos, Baranyi and Rudas [141] | Manufacturing | Others | Remote
Gauglitz, Nuernberger, Turk et al. [129] | Car repair | PC + HHD | Remote
Gupta, Ucler and Bernard [142] | New product development, Aviation industry | HMD | Remote
Gurevich, Lanir and Cohen [143] | Industry | PC + SAR | Remote
Günther, Kratz, Avrahami et al. [144] | Industry | PC + HMD | Remote
Morosi, Carli, Caruso et al. [145] | Product design | HHD + SAR | Collocated
Plopski, Fuvattanasilp, Poldi et al. [146] | Maintenance | HHD | Remote
Seo, Lee, Park et al. [147] | Industry | Combination | Variable
Zenati, Hamidia, Bellarbi et al. [148] | Maintenance | PC + HMD | Remote
Zenati, Benbelkacem, Belhocine et al. [149] | Maintenance | PC + HMD | Remote
Zenati-Henda, Bellarbi, Benbelkacem et al. [150] | Maintenance | HMD + Multitouch | Remote
References | Topic | Display Devices Used | Collaboration Setup |
Alharthi, Sharma, Sunka et al. [153] | Disaster Response | HHD + HMD | Collocated
Carbone, Freschi, Mascioli et al. [154] | Telemedicine | HMD | Remote
Elvezio, Ling, Liu et al. [155] | Rehabilitation | HMD | Variable
Davis, Can, Pindrik et al. [152] | Remote surgery | HHD | Remote
Gillis, Calyam, Apperson et al. [156] | Response Team | HMD | Remote
Kurillo, Yang, Shia et al. [157] | Telemedicine | PC + HMD | Remote
Nunes, Nedel and Roesler [158] | Exercise game | Others | Remote
Nunes, Lucas, Simões-Marques et al. [159] | Disaster Response | HHD | Variable
Popescu, Lăptoiu, Marinescu et al. [160] | Orthopaedic Surgery | HHD | Remote
Shluzas, Aldaz and Leifer [161] | Telemedicine | HMD | Remote
Sirilak and Muneesawang [162] | Telemedicine | HMD | Remote
Vassell, Apperson, Calyam et al. [163] | Response Team | HMD | Remote
References | Topic | Display Devices Used | Collaboration Setup |
Camps-Ortueta, Rodríguez-Muñoz, Gómez-Martín et al. [166] | Museum visit | HHD | Collocated
Chen, Lee, Swift et al. [167] | Scene exploration | PC + HMD | Remote
Gleason, Fiannaca, Kneisel et al. [168] | Scene exploration | HHD + HMD | Collocated
Huang, Kaminski, Luo et al. [169] | Museum visit | HHD | Collocated
Kallioniemi, Heimonen, Turunen et al. [170] | Scene exploration | SAR | Remote
Li, Nittala, Sharlin et al. [171] | Land exploration | HHD | Remote
Nuernberger, Lien, Grinta et al. [172] | Scene exploration | PC + HHD | Remote
Kallioniemi, Hakulinen, Keskinen et al. [173] | Wayfinding | SAR | Remote
References | Topic | Display Devices Used | Collaboration Setup |
CasarinPacqueriaud and Bechmann [25] | Construction | PC + HMD | Collocated |
Coppens and Mens [26] | Architectural modelling | HMD | Variable |
Cortés-Dávalos and Mendoza [27] | Layout Planning | HHD | Collocated |
CroftLuceroNeurnberger et al. [28] | Military Operations | HMD | Collocated |
DongBehzadanChen et al. [29] | Visualisation | HMD | Collocated |
ElvezioLingLiu et al. [30] | Urban Data Exploration | HMD | Collocated |
EtzoldGrimmSchweitzer et al. [31] | Construction | PC + HHD | Remote |
Flotyński and Sobociński [32] | Urban Design | Combination | Collocated |
GülUzun and Halıcı [23] | Design Planning | Combination | Variable |
IbayashiSugiuraSakamoto et al. [33] | Architecture Design | Others | Collocated |
LeonDoolanLaing et al. [34] | Computational Design | Touchscreen Display | Collocated |
LiNee and Ong [35] | FE Structural Analysis | HHD | Collocated |
LinLiuTsai et al. [20] | Construction Discussion | HHD + Public Display | Collocated |
NittalaLiCartwright et al. [36] | Field Operations | HHD + HMD | Remote |
PhanHönig and Ayanian [37] | Operations | HMD | Remote |
Rajeb and Leclercq [38] | Architectural Design | SAR | Variable |
RoKimByun et al. [39] | Architectural Design | SAR | Remote |
SchattelTönnisKlinker et al. [40] | Architectural Design | HHD | Collocated |
ShinNg and Saakes [41] | Interior Design | HHD | Collocated |
Singh and Delhi [42] | Layout Planning | HHD | Collocated |
TroutRussellHarrison et al. [43] | Military Operations | PC + HMD | Collocated |
References | Topic | Display Devices Used | Collaboration Setup |
AlhumaidanLo and Selby [44] | Learning | HHD | Collocated |
AlhumaidanLo and Selby [45] | Learning | HHD | Collocated |
BenavidesAmores and Maes [46] | Experiential learning | HMD | Remote |
Blanco-FernándezLópez-NoresPazos-Arias et al. [47] | Immersive learning, human history | HHD | Collocated |
BoyceRowanBaity et al. [48] | Military training | SAR | Collocated |
Bressler and Bodzin [49] | Learning, science forensic game | HHD | Collocated |
ChenFan and Wu [50] | Learning, horticultural science | HHD | Collocated |
DaiberKosmalla and Krüger [51] | Boulder training | HHD | Collocated |
DesaiBelmonteJin et al. [52] | Training, chemistry experiments | PC | Remote |
Fleck and Simon [53] | Learning, astronomy | SAR | Collocated |
Gazcón and Castro [54] | Learning | PC | Variable |
Gelsomini, Kanev, Hung et al. [55] | Learning, Kanji language | HHD | Collocated |
Gironacci, McCall and Tamisier [56] | Storytelling, gamification | HHD + HMD | Collocated |
Goyal, Vijay, Monga et al. [57] | Learning, programming | HHD | Collocated |
Greenwald [58] | Situated Learning | HHD + HMD | Remote |
Han, Jo, Hyun et al. [59] | Learning, dramatic play | PC | Collocated |
Iftene and Trandabăț [60] | Learning | HHD | Collocated |
Jyun-Fong and Ju-Ling [61] | Learning, local history | HHD | Collocated |
Kang, Norooz, Oguamanam et al. [62] | Embodied interaction | SAR | Collocated |
Kazanidis, Palaigeorgiou, Papadopoulou et al. [63] | Learning, interactive videos | HHD + SAR | Collocated |
Keifert, Lee, Dahn et al. [64] | Children's behaviour during collaborative activities | SAR | Collocated |
Kim and Kim [65] | Learning, English education | HHD | Collocated |
Krstulovic, Boticki and Ogata [66] | Learning | HHD | Collocated |
Le, Le and Tran [67] | Learning | HHD + HMD | Collocated |
MacIntyre, Zhang, Jones et al. [68] | Learning, programming | SAR | Collocated |
Malinverni, Valero, Schaper et al. [69] | Embodied Learning | HHD | Collocated |
Maskott, Maskott and Vrysis [70] | Learning, gamification | Combination | Collocated |
Pareto [71] | Learning, arithmetic games | Combination | Collocated |
Peters, Heijligers, de Kievith et al. [72] | Leadership training | HMD | Collocated |
Punjabi, Tung and Lin [73] | Learning by exploration | PC + HHD | Remote |
Rodríguez-Vizzuett, Pérez-Medina, Muñoz-Arteaga et al. [74] | Learning | Others | Collocated |
Sanabria and Arámburo-Lizárraga [75] | Learning | Combination | Collocated |
Shaer, Valdes, Liu et al. [76] | Experiential learning | Others | Collocated |
Shirazi and Behzadan [77] | Education, Construction | HHD | Collocated |
Shirazi and Behzadan [78] | Education, Construction | HHD | Collocated |
Sun, Liu, Zhang et al. [79] | Teaching | PC + HMD | Remote |
Sun, Zhang, Liu et al. [80] | Teaching | PC + HMD | Remote |
Thompson, Leavy, Lambeth et al. [81] | Education | HHD | Collocated |
Wiehr, Kosmalla, Daiber et al. [82] | Training, climbing | SAR | Collocated |
Yangguang, Yue and Xiaodong [83] | Training | HHD | Collocated |
Yoon, Wang and Elinich [84] | Learning | PC + SAR | Collocated |
Zubir, Suryani and Ghazali [85] | Learning | HHD | Collocated |
References | Topic | Display Devices Used | Collaboration Setup |
Akahoshi and Matsushita [92] | Game | Others | Collocated |
Baillard, Fradet, Alleaume et al. [93] | Media consumption | HHD + HMD | Collocated |
Baldauf and Fröhlich [94] | Media consumption | HHD + Public Display | Collocated |
Ballagas, Dugan, Revelle et al. [90] | Media consumption | HHD | Collocated |
Beimler, Bruder and Steinicke [95] | Animation application | PC + HMD + SAR | Collocated |
Bollam, Gothwal, Tejaswi V et al. [96] | Chess board game | HMD | Collocated |
Boonbrahm, Kaewrat and Boonbrahm [87] | 3D puzzle game | HHD | Remote |
Bourdin, Sanahuja, Moya et al. [97] | Entertainment, singing | HMD + CAVE | Remote |
Brondi, Avveduto, Alem et al. [98] | 3D jigsaw puzzle game | HMD | Remote |
Ch'ng, Harrison and Moore [99] | Interactive art | SAR | Collocated |
Courchesne, Durand and Roy [100] | Interactive art | Others | Remote |
Dal Corso, Olsen, Steenstrup et al. [101] | Game | SAR | Collocated |
Datcu, Lukosch and Lukosch [102] | Game | PC + HMD | Remote |
Datcu, Lukosch and Lukosch [103] | 3D block game | PC + HMD | Remote |
Figueroa, Hernández, Merienne et al. [104] | Game | PC + HMD | Variable |
Fischbach, Lugrin, Brandt et al. [105] | Board game | Tabletop | Collocated |
Fischbach, Striepe, Latoschik et al. [106] | Board game | SAR | Collocated |
Günther, Müller, Schmitz et al. [107] | Chess board game | HHD + HMD | Collocated |
Huo, Wang, Paredes et al. [108] | Coin collection game | HHD | Collocated |
Karakottas, Papachristou, Doumanoglou et al. [109] | Immersive game | HHD + HMD | Remote |
Lantin, Overstall and Zhao [110] | Media art | HMD | Collocated |
Loviska, Krause, Engelbrecht et al. [111] | Game | HMD | Collocated |
Mackamul and Esteves [112] | Game, match pairs | HHD + SAR | Collocated |
Margolis and Cornish [113] | Cinema production | Combination | Remote |
McGill, Williamson and Brewster [114] | Media consumption | HMD | Remote |
Mechtley, Stein, Roberts et al. [115] | Media arts | SAR | Collocated |
Pillias, Robert-Bouchard and Levieux [116] | Tangible video game | Others | Collocated |
Podkosova and Kaufmann [117] | Game | HMD | Variable |
Prins, Gunkel, Stokking et al. [118] | Media consumption | PC + HMD | Remote |
Reilly, Salimian, MacKay et al. [19] | Game, privacy and security | Tabletop + Public Display | Variable |
Rostami, Bexell and Stanisic [119] | Immersive performance | HMD | Remote |
Sato, Hwang and Koike [120] | Game | SAR | Collocated |
Spielmann, Schuster, Götz et al. [121] | Film making | HHD + HMD | Collocated |
Trottnow, Götz, Seibert et al. [122] | Cinema production | PC + HHD + HMD | Collocated |
Valverde and Cochrane [123] | Performing arts | Others | Variable |
Van Troyer [124] | Theatre performance | Others | Remote |
Vermeer, Alaka, de Bruin et al. [125] | Game, lasers | HHD | Collocated |
Wegner, Seele, Buhler et al. [126] | Game | HMD | Collocated |
Zhou, Hagemann, Fels et al. [127] | 3D game and mental puzzle | Others | Collocated |
Zimmerer, Fischbach and Latoschik [128] | Tabletop game | HHD + Tabletop | Collocated |
References | Topic | Display Devices Used | Collaboration Setup |
Abramovici, Wolf, Adwernat et al. [130] | Maintenance | HHD | Collocated |
Aschenbrenner, Li, Dukalski et al. [131] | Production Line Planning | HMD | Variable |
Bednarz, James, Widzyk-Capehart et al. [132] | Mining Industry | Combination | Remote |
Capodieci, Mainetti and Alem [133] | Maintenance | HMD + Multitouch | Remote |
Choi, Kim and Lee [134] | Industry | HHD | Remote |
Clergeaud, Roo, Hachet et al. [135] | Industry | HMD + Spatial | Remote |
Datcu, Cidota, Lukosch et al. [136] | Inflight Maintenance | Combination | Remote |
Domova, Vartiainen and Englund [137] | Industry | PC + HHD | Remote |
Elvezio, Sukan, Oda et al. [138] | Assembly, maintenance | HMD | Remote |
Funk, Kritzler and Michahelles [139] | Assembly | HMD | Collocated |
Galambos, Csapó, Zentay et al. [140] | Manufacturing | Combination | Remote |
Galambos, Baranyi and Rudas [141] | Manufacturing | Others | Remote |
Gauglitz, Nuernberger, Turk et al. [129] | Car repair | PC + HHD | Remote |
Gupta, Ucler and Bernard [142] | New product development, aviation industry | HMD | Remote |
Gurevich, Lanir and Cohen [143] | Industry | PC + SAR | Remote |
Günther, Kratz, Avrahami et al. [144] | Industry | PC + HMD | Remote |
Morosi, Carli, Caruso et al. [145] | Product design | HHD + SAR | Collocated |
Plopski, Fuvattanasilp, Poldi et al. [146] | Maintenance | HHD | Remote |
Seo, Lee, Park et al. [147] | Industry | Combination | Variable |
Zenati, Hamidia, Bellarbi et al. [148] | Maintenance | PC + HMD | Remote |
Zenati, Benbelkacem, Belhocine et al. [149] | Maintenance | PC + HMD | Remote |
Zenati-Henda, Bellarbi, Benbelkacem et al. [150] | Maintenance | HMD + Multitouch | Remote |
References | Topic | Display Devices Used | Collaboration Setup |
Alharthi, Sharma, Sunka et al. [153] | Disaster Response | HHD + HMD | Collocated |
Carbone, Freschi, Mascioli et al. [154] | Telemedicine | HMD | Remote |
Elvezio, Ling, Liu et al. [155] | Rehabilitation | HMD | Variable |
Davis, Can, Pindrik et al. [152] | Remote surgery | HHD | Remote |
Gillis, Calyam, Apperson et al. [156] | Response Team | HMD | Remote |
Kurillo, Yang, Shia et al. [157] | Telemedicine | PC + HMD | Remote |
Nunes, Nedel and Roesler [158] | Exercise game | Others | Remote |
Nunes, Lucas, Simões-Marques et al. [159] | Disaster Response | HHD | Variable |
Popescu, Lăptoiu, Marinescu et al. [160] | Orthopaedic Surgery | HHD | Remote |
Shluzas, Aldaz and Leifer [161] | Telemedicine | HMD | Remote |
Sirilak and Muneesawang [162] | Telemedicine | HMD | Remote |
Vassell, Apperson, Calyam et al. [163] | Response Team | HMD | Remote |
References | Topic | Display Devices Used | Collaboration Setup |
Camps-Ortueta, Rodríguez-Muñoz, Gómez-Martín et al. [166] | Museum visit | HHD | Collocated |
Chen, Lee, Swift et al. [167] | Scene exploration | PC + HMD | Remote |
Gleason, Fiannaca, Kneisel et al. [168] | Scene exploration | HHD + HMD | Collocated |
Huang, Kaminski, Luo et al. [169] | Museum visit | HHD | Collocated |
Kallioniemi, Heimonen, Turunen et al. [170] | Scene exploration | SAR | Remote |
Li, Nittala, Sharlin et al. [171] | Land exploration | HHD | Remote |
Nuernberger, Lien, Grinta et al. [172] | Scene exploration | PC + HHD | Remote |
Kallioniemi, Hakulinen, Keskinen et al. [173] | Wayfinding | SAR | Remote |