The growing interest in Augmented Reality (AR), together with the renaissance of
Virtual Reality (VR), has opened new approaches and techniques for how professionals interact with medical imagery; plan, train for, and perform surgeries; and help people with special needs in rehabilitation tasks. Indeed, many medical specialties already rely on 2D and 3D image data for diagnosis, surgical planning, surgical navigation, medical education, and patient-clinician communication.
However, the vast majority of current medical interfaces and interaction techniques remain
unchanged, and even the most innovative solutions have not unleashed the full potential of VR and
AR. This is probably because extending conventional workstations to accommodate VR and AR
interaction paradigms is not free of challenges. Notably, VR- and AR-based workstations
must render complex anatomical data at interactive frame rates; promote proper anatomical
insight; boost visual memory through seamless visual collaboration between professionals;
free interaction from the seated desktop (e.g., mouse and keyboard) so users can adopt
non-stationary postures and walk freely within a workspace; and support a fluid exchange of
image data and 3D models, as this fosters productive discussions to solve clinical cases.
Moreover, VR- and AR-based techniques must be designed according to sound human-computer
interaction principles, since medical professionals are known to resist changes to their workflow.

In this course, we survey recent AR/MR/VR approaches to healthcare, including diagnosis, surgical training, planning, and follow-up, as well as tools for patient rehabilitation. We discuss the challenges, techniques, and principles of applying Extended Reality in these contexts and outline opportunities for future research.

References: https://dl.acm.org/citation.cfm?id=3359418

This course was also presented in Toronto in two additional talks.
8. Reality–Virtuality Continuum
From fully real to fully virtual: Real Reality, Diminished Reality, Spatial AR, See-Through AR, Augmented Virtuality, Semi-immersive VR, Immersive VR.
Umbrella terms: Augmented Reality, Mixed Reality, Extended Reality.
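One way to read the continuum slide is as an ordered scale, with the umbrella terms covering contiguous spans of it. The sketch below models this in Python; note that the exact span each umbrella term covers is an assumption here, as the boundaries vary by author:

```python
from enum import IntEnum

class RVContinuum(IntEnum):
    """Points along the Reality-Virtuality continuum, ordered
    from fully real (lowest) to fully virtual (highest)."""
    REAL_REALITY = 0
    DIMINISHED_REALITY = 1
    SPATIAL_AR = 2
    SEE_THROUGH_AR = 3
    AUGMENTED_VIRTUALITY = 4
    SEMI_IMMERSIVE_VR = 5
    IMMERSIVE_VR = 6

# Umbrella terms group contiguous spans of the continuum
# (one plausible grouping; boundaries differ in the literature).
AUGMENTED_REALITY = {RVContinuum.DIMINISHED_REALITY,
                     RVContinuum.SPATIAL_AR,
                     RVContinuum.SEE_THROUGH_AR}
MIXED_REALITY = AUGMENTED_REALITY | {RVContinuum.AUGMENTED_VIRTUALITY}
EXTENDED_REALITY = MIXED_REALITY | {RVContinuum.SEMI_IMMERSIVE_VR,
                                    RVContinuum.IMMERSIVE_VR}

# The umbrella terms nest: AR is a proper subset of MR, MR of XR.
assert AUGMENTED_REALITY < MIXED_REALITY < EXTENDED_REALITY
```

The ordering makes comparisons meaningful (e.g., `RVContinuum.SPATIAL_AR < RVContinuum.IMMERSIVE_VR`), which is convenient when reasoning about where a given system sits on the continuum.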
9. Challenges
Building a network of healthcare professionals
Identifying current practices
Linguistic Barrier
Big Picture but no Specs
“Automagic worshippers”
79. If you want to know more
Beatriz Peres, Pedro F. Campos, Aida Azadegan, A Persuasive Approach in Using Visual Cues to Facilitate Mobility Using Forearm Crutches. BCSS@PERSUASIVE 2019
Beatriz Peres, Pedro F. Campos, Aida Azadegan, Digitally Augmenting the Physical Ground Space with Timed Visual Cues for Crutch-Assisted Walking. CHI Extended Abstracts 2019
Beatriz Peres, Pedro F. Campos, Aida Azadegan, A Digitally-Augmented Ground Space with Timed Visual Cues for Facilitating Forearm Crutches’ Mobility. Full paper, Interact 2019
103. Acknowledgments
ARCADE Afonso Faria, André Domingues
LOCOMOTIVER Alexandre Gordo
VRRRRoom Maurício Sousa, Daniel Mendes, Soraia Paulo
CAVE COLON Pedro Borges, Daniel Medeiros, Soraia Paulo
Voxel TIPs Rita Mendes, Pedro Parreira, Soraia Paulo
Anatomy Studio Maurício Sousa, Daniel Mendes, Rafael dos Anjos, Soraia Paulo
Voxel Explorer Pedro Parreira, Soraia Paulo
TOOTHFAIRY Filipe Relvas, Soraia Paulo
Laparoscopic HUD Miguel Belo, Soraia Paulo
Minimally Invasive Surgery José Miguel Gomes
ImplantAR Tiago Jerónimo
Plausible Box Figures Nuno Matias
Imperceptible Breathing Filipe Marques
108. D.S. Lopes, D. Medeiros, S.F. Paulo, P.B. Borges, V. Nunes, V. Mascarenhas, M.
Veiga, J.A. Jorge, Interaction Techniques for Immersive CT Colonography: A
Professional Assessment, In: Frangi A., Schnabel J., Davatzikos C., Alberola-
López C., Fichtinger G. (eds) Medical Image Computing and Computer Assisted
Intervention – MICCAI 2018. MICCAI 2018. Lecture Notes in Computer Science,
vol. 11071, Pages 629–637, Springer, Cham, 2018. DOI:
10.1007/978-3-030-00934-2_70
M. Sousa, D. Mendes, S. Paulo, N. Matela, J. Jorge, D.S. Lopes, VRRRRoom:
Virtual Reality for radiologists in the reading room, Proceedings of the 35th Annual
ACM Conference on Human Factors in Computing Systems (CHI 2017), New
York: ACM Press, 2017. DOI: 10.1145/3025453.3025566
D.S. Lopes, P.F. Parreira, S.F. Paulo, V. Nunes, P.A. Rego, M.C. Neves, P.S.
Rodrigues, J.A. Jorge, On the utility of 3D hand cursors to explore medical volume
datasets with a touchless interface, Journal of Biomedical Informatics, 72, Pages
140–149, 2017. DOI: 10.1016/j.jbi.2017.07.009
Daniel Simões Lopes, Pedro F. Parreira, Ana R. Mendes, Vasco M. Pires, Soraia
F. Paulo, Carlos Sousa, Joaquim A. Jorge, Explicit design of transfer functions for
volume-rendered images by combining histograms, thumbnails, and sketch-based
interaction, The Visual Computer, December 2018, Volume 34, Issue 12, pp
1713–1723. DOI: 10.1007/s00371-017-1448-8
109. Soraia Figueiredo Paulo, Filipe Relvas, Hugo Nicolau, Yosra Rekik, Vanessa
Machado, João Botelho, José João Mendes, Laurent Grisoni, Joaquim Jorge,
Daniel Simões Lopes, Touchless interaction with medical images based on 3D
hand cursors supported by single-foot input: A case study in dentistry, Journal of
Biomedical Informatics, Volume 100, 2019, 103316, DOI: 10.1016/
j.jbi.2019.103316
Daniel Simões Lopes, Filipe Relvas, Soraia Paulo, Yosra Rekik, Laurent Grisoni,
Joaquim Jorge. 2019. FEETICHE: FEET Input for Contactless Hand gEsture
Interaction. In The 17th International Conference on Virtual Reality Continuum
and its Applications in Industry (VRCAI ’19), November 14–16, 2019, Brisbane,
QLD, Australia. ACM, New York, NY, USA, 10 pages. DOI:
10.1145/3359997.3365704
Ezequiel R. Zorzal, Maurício Sousa, Daniel Mendes, Rafael Kuffner dos Anjos,
Daniel Medeiros, Soraia Figueiredo Paulo, Pedro Rodrigues, José João Mendes,
Vincent Delmas, Jean-Francois Uhl, José Mogorrón, Joaquim Armando Jorge,
Daniel Simões Lopes, Anatomy Studio: a Tool for Virtual Dissection Through
Augmented 3D Reconstruction, Computers & Graphics, DOI: 10.1016/
j.cag.2019.09.006