Communication

A systematic review of 3D cursor in the medical literature

  • Received: 01 January 2018 Accepted: 15 March 2018 Published: 20 March 2018
  • The term 3D cursor has been used in the computer graphics industry for quite some time; however, in recent years, it has also been used in the medical field. In medicine, the term 3D cursor has been used to describe a user’s hands, hand-held controllers, a 2D cursor that can travel in 3D space, and a volume-subtending 3D cursor. In this article, we perform a systematic review of the medical literature on the term “3D cursor” and discuss its applications in the fields of diagnostic radiology and surgery. We discuss the important applications of the 3D cursor, including the use of a 3D cursor in combination with virtual reality (VR) and augmented reality (AR) in medicine.

    Citation: David B. Douglas, Robert E. Douglas, Cliff Wilke, David Gibson, John Boone, Max Wintermark. A systematic review of 3D cursor in the medical literature[J]. AIMS Electronics and Electrical Engineering, 2018, 2(1): 1-11. doi: 10.3934/ElectrEng.2018.1.1



    1. Introduction

    In the United States, the total cost of diagnostic medical imaging was estimated to be $100 billion in 2006 alone [1], and utilization rates of medical imaging are increasing [2,3,4]. Similarly, the augmented reality (AR)/virtual reality (VR) industry is large and growing rapidly. Worldwide revenues from the AR and VR markets are projected to increase from $5 billion in 2016 to $162 billion in 2020 [5].

    It is foreseeable that the two fields of AR/VR and diagnostic medical imaging will merge in the near term, since AR/VR provides true depth perception, fly-through viewing, an improved human-machine interface (HMI), and other features that are likely to enhance the diagnostic and therapeutic utility of diagnostic imaging [6,7,8,9,10]. As these two fields merge, one key component that deserves specific attention is the 3D cursor, since it will provide efficient navigation through anatomical structures.

    While the concept of a 3D cursor has been discussed in the computer graphics industry for quite some time [11], it is infrequently discussed in the medical literature, and no review article on the use of a 3D cursor in medicine has been published to date. The purpose of this paper is to provide the reader with a systematic review of the literature on the use of a 3D cursor in medicine.


    2. Materials and methods

    In consultation with an experienced health sciences librarian, we conducted a systematic review in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [12] in January 2018, using the search term "3D cursor" in PubMed, the Association for Computing Machinery (ACM) Digital Library and the Institute of Electrical and Electronics Engineers (IEEE) Digital Library. Two authors (D.D. and R.D.) reviewed the abstracts of all of the search results to determine relevance to medicine. Articles deemed relevant to medical imaging were downloaded and reviewed in full text. Finally, the uses of the term "3D cursor" in medicine were discussed and categorized.


    3. Results

    The PubMed search yielded a total of 6 articles [8,9,13,14,15,16], all of which were relevant to the medical field. The ACM Digital Library search yielded a total of 16 references [17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32], one of which [29] was relevant to the medical field. The IEEE Digital Library search yielded a total of 19 references [33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51], three of which [36,45,48] were relevant to the medical field. The citations were imported into EndNote 7 (Thomson Reuters, New York, NY). Thus, a total of 31 references were not relevant to the medical field and a total of 10 references were relevant to the medical field. These references fell into three categories: (1) hands or a hand-held controller; (2) a non-volume-subtending digital 3D cursor capable of moving in all three axes; and (3) a volume-subtending digital 3D cursor capable of moving in all three axes. See Figure 1.

    Figure 1. This figure outlines the flow chart for article selection used in this study. Out of the 41 total references retrieved using the search term "3D cursor" in PubMed, the Association for Computing Machinery (ACM) Digital Library and the Institute of Electrical and Electronics Engineers (IEEE) Digital Library, only 10 had relevance to the field of medicine. These 10 were subsequently placed into three categories: "hands" or "hand-held controller" [13,16,29]; "non-volume-subtending 3D cursor moving in 3D space" [14,15,45,48]; and "volume-subtending 3D cursor" [8,36,52].
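    The article-selection arithmetic above can be summarized in a few lines. The following is a minimal sketch that simply tallies the per-database counts reported in this section (no database queries are performed):

```python
# Per-database search results and medically relevant hits, as reported in Section 3.
search_results = {"PubMed": 6, "ACM Digital Library": 16, "IEEE Digital Library": 19}
relevant_hits = {"PubMed": 6, "ACM Digital Library": 1, "IEEE Digital Library": 3}

total = sum(search_results.values())        # 41 references retrieved in total
relevant = sum(relevant_hits.values())      # 10 relevant to the medical field
excluded = total - relevant                 # 31 excluded as not relevant

print(f"Total: {total}, relevant: {relevant}, excluded: {excluded}")
```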

    3.1. 3D cursor referring to "hands" or "hand-held controller"

    In 1996, the term "3D cursor" was first used in the medical literature to represent a hand-held controller. Wong and colleagues described their method of using a hand-held controller as a 3D cursor to select particular volumes of interest [16]. Their setup included wireless glasses synchronized to rapidly alternating left-eye and right-eye images displayed on a cathode ray tube, with the observer holding the controller in his hand. The 3D cursor is a tangible object used to select particular regions within the volume of interest. See Figure 2.

    Figure 2. This figure illustrates the operator holding a tangible object referred to as the "3D cursor". Reprinted from Radiology, Volume 198, by Wong et al., "Stereoscopically guided characterization of three-dimensional dynamic MR images of the breast", 288–291, 1996, with permission [16].

    Fifteen years later, Jerald and Yoganandan developed an interactive medical visualization system that uses an advanced hand-held controller system [29]. The uniqueness of their system is that the viewpoint of the images is controlled by two hand-held controllers, with each controller manipulating a volume-rendered image. These authors use the terminology "3D cursor" to denote the user's hands holding the hand-held controllers. This approach yielded faster placement of the 3D object: 4.5–4.7 times as fast as a mouse interface and 1.3–1.7 times as fast as a one-handed wand.

    Recently, in the 2017 article "On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface," Lopes and colleagues discussed the limitations of conventional 2D input devices, including mice and keyboards, which hamper 3D analysis [13]. They noted that several attempts with conventional 2D input devices (i.e., mouse and keyboard) may be needed to achieve the orientation required for 3D analysis. Furthermore, if the user is a surgeon, sterile precautions must be maintained, adding a further challenge to achieving the desired orientation. Lopes and colleagues therefore developed a touchless interface that interprets simple hand gestures to facilitate manipulation of 3D medical data, wherein each human hand acts as a 3D cursor. In their study, the touchless interface improved spatial awareness and allowed more fluent interaction with the 3D volume than traditional 2D input devices. See Figure 3.

    Figure 3. This figure illustrates the operator holding up his hand to manipulate the medical image; in this case, his hand is referred to as the "3D cursor". Reprinted from the Journal of Biomedical Informatics, Lopes et al., "On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface", Volume 72, 140–149, 2017, with permission [13].
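    The hand-as-3D-cursor idea can be pictured as a coordinate mapping: a tracked hand position reported by a depth sensor is normalized into the voxel grid of the image volume, and the resulting index becomes the cursor position. The sketch below is purely illustrative and assumes a hypothetical sensor workspace and volume size; it is not the interface implemented by Lopes et al. [13].

```python
import numpy as np

def hand_to_voxel(hand_mm, workspace_min_mm, workspace_max_mm, volume_shape):
    """Map a tracked hand position (mm, sensor frame) to a voxel index (3D cursor)."""
    hand = np.asarray(hand_mm, dtype=float)
    lo = np.asarray(workspace_min_mm, dtype=float)
    hi = np.asarray(workspace_max_mm, dtype=float)
    # Normalize the hand position to [0, 1] within the tracked workspace.
    t = np.clip((hand - lo) / (hi - lo), 0.0, 1.0)
    # Scale to voxel indices; the result is the 3D cursor position.
    idx = (t * (np.asarray(volume_shape) - 1)).round().astype(int)
    return tuple(int(v) for v in idx)

# Hypothetical example: a hand at the centre of a 400 mm cubic workspace lands
# near the centre voxel of a 512 x 512 x 500 volume.
print(hand_to_voxel((0, 0, 0), (-200, -200, -200), (200, 200, 200), (512, 512, 500)))
```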

    3.2. 3D cursor referring to a digital cursor capable of moving in all three axes

    In 1995, the term "3D cursor" was first used to denote a cursor that could move in 3D space, in the article "Interactive delineation of brain sulci and their merging into functional PET images" by Michel et al. [45]. Michel and colleagues developed software tools to assist the neurologist and physiologist in the analysis of cerebral images from structural magnetic resonance imaging (MRI) and functional positron emission tomography (PET). Once the two datasets were coregistered, the authors built a 3D trace line for each sulcus of the brain, which helps the user follow the curvature of the brain surface. The 3D cursor successfully provided the position in 3D space needed to link the surface viewer with the volume.

    Five years later, Goodsitt and colleagues evaluated depth perception using a virtual 3D cursor in mammography [15]. In this study, three observers wore stereo glasses and estimated the depth of a fibril by adjusting the position of a cross-shaped virtual cursor on the computer system. All three observers were able to accurately estimate the depth of vertically oriented objects, but only one was able to accurately estimate the depth of the horizontally oriented object. The authors attributed this to the observers' varying aptitude for stereoscopic visualization. A figure of the 3D cursor was not provided in this article.

    It was not until 2013 that the term "3D cursor" was again used in this fashion. In that study, Wang and colleagues explored brain-computer interface (BCI) technology in a subject with tetraplegia, with an electrode surgically placed inside the skull over the sensorimotor cortex of the brain [14]. Wang and colleagues assessed whether the subject using the BCI was able to move a 2D and a 3D cursor over a period of 30 days. Their 2D cursor moved in the x-y plane, and their 3D cursor added movement in the z-direction.

    Most recently, in 2015, Eagleson and colleagues used the term "3D cursor" in a manner similar to the other three groups in developing a prototype system for pre-operative planning [48]. Their NeuroTable system includes a haptic device that facilitates control over a 3D cursor, allowing users to target specific points within the pre-operative 3D scan.
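    In this second usage, the 3D cursor is a single point that can be translated along any of the three axes of the dataset, in contrast to a conventional 2D cursor confined to the display plane. A minimal illustrative sketch follows; the class and its bounds are hypothetical and are not taken from any of the cited systems.

```python
from dataclasses import dataclass

@dataclass
class Cursor3D:
    """A non-volume-subtending 3D cursor: a single point movable in x, y and z."""
    x: int = 0
    y: int = 0
    z: int = 0

    def move(self, dx=0, dy=0, dz=0, shape=(512, 512, 500)):
        """Translate the cursor, clamping it to the bounds of the image volume."""
        self.x = min(max(self.x + dx, 0), shape[0] - 1)
        self.y = min(max(self.y + dy, 0), shape[1] - 1)
        self.z = min(max(self.z + dz, 0), shape[2] - 1)
        return self

cursor = Cursor3D(x=256, y=256, z=0)
cursor.move(dz=120)   # unlike a 2D cursor, it can also step through slices (z)
print(cursor)         # Cursor3D(x=256, y=256, z=120)
```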


    3.3. Volume-subtending digital 3D cursor capable of moving in all three axes

    In 2015, Moreira and colleagues developed a web-based image annotation tool for markup of radiology images in volumetric datasets using axial, coronal and sagittal planes [36]. The authors evaluated radiologists' preferences among three options for marking a 3D region of interest (ROI): pixel joining by similarity; border detection by imaging features; and a spheric-shaped 3D cursor. Moreira et al. found that the 6 radiologists preferred the spheric 3D cursor as the best interface for marking 3D ROIs. While not explicitly stated by the authors, this is a volume-subtending 3D cursor. See Figure 4.

    Figure 4. This figure illustrates the spheric ROI used by Moreira et al. with its projections on the axial (top left), coronal (top right) and sagittal (bottom left) planes. Reprinted from the IEEE 28th International Symposium on Computer-Based Medical Systems, Moreira et al., "3D Markup of Radiological Images in ePAD, a Web-Based Image Annotation Tool", p. 101, with permission [36].

    In 2016–2017, Douglas and colleagues designed a 3D cursor that itself subtends a volume and is embedded into the volumetric medical imagery [8,9]. Like the non-volume-subtending 3D cursor, this cursor can be moved anywhere in 3D space. It is denoted in red, but other false colors are possible. This 3D cursor was developed as part of the depth-3-dimensional (D3D) AR/VR imaging system, which provides stereoscopic imaging with depth perception and head tracking. The primary purpose for the creation of this cursor was to improve visualization of certain 3D anatomical structures. See Figures 5 and 6.

    Figure 5. This figure illustrates a stereo pair of images from DXC Technology (formerly Hewlett-Packard Enterprise) and D3D Enterprises showing the cerebral vasculature, which would be viewed with either VR or AR. Note that the image on the left is the left eye viewing perspective and the image on the right is the right eye viewing perspective, which together provide depth perception. The red boxes represent the 3D cursors used. The white arrows point to middle cerebral artery branches, which course at varying depths. Reprinted from Biology, Engineering and Medicine, Douglas et al., "Augmented reality: advances in surgery", Volume 2, 1–8, 2017, with permission [52].
    Figure 6. (A) Contrast-enhanced breast CT demonstrates the mass with small spiculations extending from the margins. (B & C) The same mass from the breast CT exam seen in (A), but viewed with depth-3-dimensional (D3D) imaging, where (B) represents the left eye viewing perspective (LEVP) and (C) represents the right eye viewing perspective (REVP). The red box illustrates the 3D cursor used. (D & E) The same mass from the breast CT, but zoomed in and viewed from a different perspective, with (D) representing the LEVP and (E) representing the REVP. Red arrows show spiculations extending from the margins of the mass. The red circle highlights a spiculation projecting toward the user, which was well seen when rotating the volume with the D3D system. Reprinted from the Journal of Nature and Science, Douglas et al., "Augmented Reality Imaging System: 3D Viewing of a Breast Cancer", Volume 2, 215, 2016, with permission [8].
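    A volume-subtending 3D cursor can be thought of as a region of voxels (for example, a sphere) rather than a single point: the region can be moved anywhere in the dataset, and the voxels it subtends can be highlighted or extracted as a volume of interest. The sketch below marks the voxels under a spherical cursor in a synthetic volume; it illustrates only the general concept and is not the D3D or ePAD implementation.

```python
import numpy as np

def spherical_cursor_mask(volume_shape, center, radius):
    """Boolean mask of the voxels subtended by a spherical 3D cursor."""
    zz, yy, xx = np.indices(volume_shape)
    cz, cy, cx = center
    dist2 = (zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2
    return dist2 <= radius ** 2

# Synthetic CT-like volume (slices, rows, cols) and a cursor centred on a voxel of interest.
volume = np.zeros((64, 256, 256), dtype=np.int16)
mask = spherical_cursor_mask(volume.shape, center=(32, 128, 128), radius=10)

print(mask.sum(), "voxels inside the cursor")   # roughly (4/3)*pi*r**3 voxels
voi = volume[mask]                              # the voxels highlighted by the cursor
```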

    4. Discussion

    The term "3D cursor" has been used 10 times in the medical literature, but refers to three distinct topics: "hands" or "hand-held controller"; "non-volume subtending cursor moving in 3D space"; and, "volume-subtending 3D cursor that can move in 3D space". The role of 3D cursors may expand in years to come.

    Surgeons must be able to reference medical images throughout an operation; thus, operating rooms are equipped with radiology picture archiving and communication systems (PACS). However, since surgeons must maintain sterility when operating, they have two options for viewing patient images: scrub out of the surgery to manipulate and view the images, or have an assistant who is not scrubbed in manipulate the PACS to show the desired image. Both of these options take valuable time in the operating room, and a better solution is needed. The concept of the surgeon's hands serving as the "3D cursor", as highlighted by Lopes et al. [13], has an important role in improving care, as it would allow the surgeon to control images while maintaining sterility. In addition to the use of "hands" as a 3D cursor, real-world interface props, which can be manipulated by the user to specify spatial relationships, have specific advantages.

    Surgical props can be equipped with orientation trackers; thus, humans may be able to interface more naturally with the virtual world. Interface props with orientation tracking have been discussed for some time, including research in neurosurgery in which a head viewing prop, a cutting-plane selection prop, and a trajectory selection prop were used [53]. Manufacturers could make hand-held controllers sterile as well, which would benefit surgeons by providing an efficient method to view key images throughout an operation. The "hand-held controller" type of 3D cursor may also have a role in image manipulation in diagnostic radiology; however, radiologists face an additional challenge.

    Medical imaging datasets are extremely large due to advances in the spatial resolution of computed tomography (CT) and magnetic resonance imaging (MRI). As an example, an axial chest CT has a matrix of 512 × 512 pixels per slice with 500+ sub-1mm-thick slices; thus, a chest CT comprises over 130 million voxels. A 3D cursor capable of moving efficiently to any voxel within the dataset would improve on conventional "slice-by-slice" viewing of medical images, especially since a radiologist needs to pass through the volume several times at different window-level settings for a comprehensive review.
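    The voxel count follows directly from the acquisition matrix and the number of slices; a one-line check, using 500 slices as a representative figure:

```python
rows, cols, slices = 512, 512, 500   # axial matrix and a representative slice count
voxels = rows * cols * slices
print(f"{voxels:,} voxels")          # 131,072,000 -- over 130 million
```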

    Looking to the future of radiology with AR/VR viewing, 3D cursors are expected to play a role. AR/VR D3D viewing offers depth perception, head tracking, joystick fly-through, focal point convergence, false coloring, and transparency adjustment, which together improve the HMI [6]. Several imaging scenarios have been explored, including discerning patterns of microcalcifications (branching vs. cluster) [9], breast cancer [8], lung cancer [54], and brain aneurysm [52]. The 3D cursor plays an essential role not only in maneuvering efficiently through the human body, but also in highlighting particular areas of concern. Many abnormalities in the human body are subtle but important; drawing attention to them and communicating them through use of a 3D cursor may prove life-saving.


    5. Conclusion

    Three distinct concepts of a 3D cursor have been discussed in the medical literature. One is the use of an individual's hands or a hand-held controller to input commands into the computer, which has the key potential advantage of allowing a surgeon to control an image while maintaining sterility. Another is a digital non-volume-subtending 3D cursor that can move in any of the three dimensions, providing efficient movement through large volumetric datasets. The third is a volume-subtending 3D cursor, which has the advantages of both moving efficiently within a large volumetric dataset and highlighting particular volumes of interest.


    Conflict of interest

    Author R.E.D. has a direct financial interest in D3D Technologies. The author D.B.D. is a family member of R.E.D. Authors C.W. and D.G. are employees of DXC Technology. Authors J.B. and M.W. have no financial interest.


    [1] Mitchell JM, LaGalia RR (2009) Controlling the escalating use of advanced imaging: the role of radiology benefit management programs. Med Care Res Rev 66: 339–351. doi: 10.1177/1077558709332055
    [2] Mettler FA, Jr., Wiest PW, Locken JA, et al. (2000) CT scanning: patterns of use and dose. J Radiol Prot 20: 353–359. doi: 10.1088/0952-4746/20/4/301
    [3] Mitchell DG, Parker L, Sunshine JH, et al. (2002) Body MR imaging and CT volume: variations and trends based on an analysis of medicare and fee-for-service health insurance databases. Am J Roentgenol 179: 27–31. doi: 10.2214/ajr.179.1.1790027
    [4] Boone JM, Brunberg JA (2008) Computed tomography use in a tertiary care university hospital. J Am Coll Radiol 5: 132–138. doi: 10.1016/j.jacr.2007.07.008
    [5] MEREL T (2015) The 7 drivers of the $150 billion AR/VR industry. Aol Tech.
    [6] Douglas D (2013) US 8,384,771 Method and Apparatus for Three Dimensional Viewing of Images. USA: US Patent Office.
    [7] Douglas D (2016) US 9,349,183 Method and Apparatus for Three Dimensional Viewing of Images. USA: US Patent Office.
    [8] Douglas DB, Boone JM, Petricoin E, et al. (2016) Augmented Reality Imaging System: 3D Viewing of a Breast Cancer. J Nat Sci 2.
    [9] Douglas DB, Petricoin EF, Liotta L, et al. (2016) D3D augmented reality imaging system: proof of concept in mammography. Med Devices (Auckl) 9: 277–283.
    [10] Douglas DB, Wilke CA, Gibson JD, et al. (2017) Augmented Reality: Advances in Diagnostic Imaging. Multimodal Technologies and Interaction 1: 29. doi: 10.3390/mti1040029
    [11] Butts DR, McAllister DF (1988) Implementation of true 3D cursors in computer graphics. SPIE Proc 902: Three-Dimensional Imaging and Remote Sensing Imaging (January 1988): 74–84.
    [12] Moher D, Liberati A, Tetzlaff J, et al. (2009) Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Medicine 6: e1000097. doi: 10.1371/journal.pmed.1000097
    [13] Lopes DS, de Figueiredo Parreira PD, Paulo SF, et al. (2017) On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface. J Biomed Inform 72: 140–149. doi: 10.1016/j.jbi.2017.07.009
    [14] Wang W, Collinger JL, Degenhart AD, et al. (2013) An electrocorticographic brain interface in an individual with tetraplegia. PLoS ONE 8: e55344. doi: 10.1371/journal.pone.0055344
    [15] Goodsitt MM, Chan HP, Hadjiiski L (2000) Stereomammography: Evaluation of depth perception using a virtual 3D cursor. Med Phys 27: 1305–1310.
    [16] Wong TZ, Lateiner JS, Mahon TG, et al. (1996) Stereoscopically guided characterization of three-dimensional dynamic MR images of the breast. Radiology 198: 288–291. doi: 10.1148/radiology.198.1.8539396
    [17] Park S, Kim S, Park J (2012) Select ahead: efficient object selection technique using the tendency of recent cursor movements. Asia-Pacific Computer and Human Interaction: 51–58.
    [18] Katzakis N, Kiyokawa K, Takemura H (2013) Plane-casting: 3D cursor control with a smartphone. Asia-Pacific Computer and Human Interaction: 199–200.
    [19] Hudson SE (1992) The interaction technique notebook: Adding shadows to a 3D cursor. ACM Transactions on Graphics (TOG) 11: 193–199. doi: 10.1145/130881.370599
    [20] Dorta T, Kinayoglu G, Hoffmann M (2015) Hyve-3D and rethinking the 3D cursor: unfolding a natural interaction model for remote and local co-design in VR. International Conference on Computer Graphics and Interactive Techniques: 43.
    [21] Biocca F, Tang A, Owen C, et al. (2006) Attention funnel: omnidirectional 3D cursor for mobile augmented reality platforms. Human Factors in Computing Systems: 1115–1122.
    [22] Jung T, Bauer P (2017) Constraint-based modeling technique for mid-air interaction. Symposium on Spatial User Interaction: 157–157.
    [23] Feng J, Wartell Z (2014) Riding the plane: bimanual, desktop 3D manipulation. User Interface Software and Technology: 93–94.
    [24] Brewer J, Anderson D (1976) Techniques for interactive three dimensional design. International Conference on Computer Graphics and Interactive Techniques: 13–30.
    [25] Venolia D (1993) Facile 3D direct manipulation. Human Factors in Computing Systems: 31–36.
    [26] Teather RJ, Stuerzlinger W (2012) A system for evaluating 3D pointing techniques. Virtual Reality Software and Technology: 209–210.
    [27] Elmqvist N (2005) BalloonProbe: Reducing occlusion in 3D using interactive space distortion. Virtual Reality Software and Technology: 134–137.
    [28] Brewer JA, Anderson DC (1977) Visual interaction with overhauser curves and surfaces. International Conference on Computer Graphics and Interactive Techniques 11: 132–137.
    [29] Jerald J, Yoganandan A (2011) iMedic: immersive medical environment for distributed interactive consultation. International Conference on Computer Graphics and Interactive Techniques: 99–99.
    [30] Menelas B-AJ (2013) Interactive analysis of cavity-flows in a virtual environment. Spring Conference on Computer Graphics: 31–37.
    [31] Serrar Z, Elmarzouqi N, Jarir Z, et al. (2014) Evaluation of Disambiguation Mechanisms of Object-Based Selection in Virtual Environment: Which Performances and Features to Support "Pick Out"? International Conference on Human-Computer Interaction: 29.
    [32] Ware C, Lowther K (1997) Selection using a one-eyed cursor in a fish tank VR environment. ACM Transactions on Computer-Human Interaction (TOCHI) 4: 309–322. doi: 10.1145/267135.267136
    [33] Biocca F, Tang A, Owen C, et al. (2006) The omnidirectional attention funnel: A dynamic 3D cursor for mobile augmented reality systems. Hawaii International Conference on System Sciences 1: 22c–22c.
    [34] Kadri A, Lécuyer A, Burkhardt J-M, et al. (2007) The Influence of Visual Appearance of User's Avatar on the Manipulation of Objects in Virtual Environments. IEEE Virtual Reality Conference: 291–292.
    [35] Young TS, Teather RJ, MacKenzie IS (2017) An arm-mounted inertial controller for 6DOF input: Design and evaluation. Symposium on 3D User Interfaces: 26–35.
    [36] Moreira DA, Hage C, Luque EF, et al. (2015) 3D markup of radiological images in ePAD, a web-based image annotation tool. Computer-Based Medical Systems: 97–102.
    [37] Jáuregui DAG, Argelaguet F, Lecuyer A (2012) Design and evaluation of 3D cursors and motion parallax for the exploration of desktop virtual environments. Symposium on 3D User Interfaces: 69–76.
    [38] Kadri A, Lécuyer A, Burkhardt J-M (2007) The visual appearance of user's avatar can influence the manipulation of both real devices and virtual objects. Symposium on 3D User Interfaces: 11.
    [39] Wither J, Höllerer T (2005) Pictorial depth cues for outdoor augmented reality. International Symposium on Wearable Computers: 92–99.
    [40] Wither J, Höllerer T (2004) Evaluating techniques for interaction at a distance. International Symposium on Wearable Computers 1: 124–127.
    [41] Wu S-T, Abrantes M, Tost D, et al. (2003) Picking and snapping for 3d input devices. Brazilian Symposium on Computer Graphics and Image Processing: 140–147.
    [42] Schwartz AB, Tillery SH, Taylor DM (2003) Cortical control of natural arm movement. International IEEE/EMBS Conference on Neural Engineering: 99.
    [43] Adachi Y (1993) Touch and trace on the free-form surface of virtual object. IEEE Virtual Reality Conference: 162–168.
    [44] Stein T, Coquillart S (2000) The metric cursor. Pacific Conference on Computer Graphics and Applications: 381–386.
    [45] Michel C, Sibomana M, Bodart J-M, et al. (1995) Interactive delineation of brain sulci and their merging into functional PET images. Nuclear Science Symposium and Medical Imaging Conference 3: 1480–1484.
    [46] Ernst H, Petzold J, Larice R, et al. (1996) Mixing of computer graphics and high-quality stereographic video. IEEE transactions on consumer electronics 42: 795–799. doi: 10.1109/30.536187
    [47] Özacar K, Hincapié-Ramos JD, Takashima K, et al. (2016) 3D Selection Techniques for Mobile Augmented Reality Head-Mounted Displays. Interact Comput 29: 579–591.
    [48] Eagleson R, Wucherer P, Stefan P, et al. (2015) Collaborative table-top VR display for neurosurgical planning. IEEE Virtual Reality Conference: 169–170.
    [49] Ernst H, Petzold J, Larice R, et al. (1996) High-quality overlay of live stereo video on computer graphics. International Conference on Consumer Electronics: 404.
    [50] Taylor DM (2007) The importance of online error correction and feed-forward adjustment in brain-machine interfaces for restoration of movement. Toward Brain-computer Interfacing: 161.
    [51] Dang N-T (2007) A survey and classification of 3D pointing techniques. IEEE International Conference on Research, Innovation and Vision for the Future: 71–80.
    [52] Douglas DB, Wilke CA, Gibson D, et al. (2017) Virtual reality and augmented reality: Advances in surgery. Biol Eng Med 2: 1–8.
    [53] Hinckley K, Pausch R, Goble JC, et al. (1994) Passive real-world interface props for neurosurgical visualization. Human Factors in Computing Systems: 452–458.
    [54] Douglas D, Wilke C, Gibson D, et al. (2018) Depth-3-Dimensional (D3D) Augmented Reality Viewing of a Lung Cancer Imaged with PET: Proof of Concept. SNMMI Mid-Winter Meeting 2018, Orlando, FL.
  • This article has been cited by:

    1. Vladimir Saveljev, Jung-Young Son, Choonsik Yim, Gwanghee Heo (2022) Three-dimensional interactive cursor based on voxel patterns for autostereoscopic displays, 23: 137. doi: 10.1080/15980316.2022.2029591
    2. Rohana Abdul Karim, Muhamad Hamizi Zaidi Bin Mohd Jonhanis, Wan Nur Azhani Binti W. Samsudin, Nurul Wahidah Arshad, Nor Farizan Zakaria (2022) Comparative Study for Cursor Detection at Endoscopic Images for Telepointer: 203. doi: 10.1109/ICSPC55597.2022.10001816
  • © 2018 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
