Utilizing image guided surgery for user interaction in medical augmented reality

Citable link (URI): http://nbn-resolving.de/urn:nbn:de:bsz:21-opus-16617
http://hdl.handle.net/10900/48730
Document type: Miscellaneous texts
Date of publication: 2005
Original publication: WSI ; 2005 ; 4
Language: English
Faculty: 7 Faculty of Mathematics and Natural Sciences
Department: Other - Information and Cognitive Sciences
DDC classification: 004 - Computer science
Keywords: Augmented reality <computer science>, Visualization, Human-machine communication, Computer graphics
Free keywords: Medical visualization, Intraoperative navigation, Image Guided Surgery
License: http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=de http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=en

Abstract:

The graphical overlay of additional medical information over the patient during a surgical procedure has long been considered one of the most promising applications of augmented reality. While many experimental systems for augmented reality in medicine have reached an advanced state and can deliver high-quality augmented video streams, they usually depend heavily on specialized, dedicated hardware. Such dedicated system components, which were originally designed for engineering applications or VR research, are often ill-suited for use in clinical practice. We have described a novel medical augmented reality application that is based almost exclusively on existing, commercially available, and certified medical equipment. In our system, a so-called image guided surgery device is used to track a webcam, which delivers the digital video stream of the physical scene that is augmented with the virtual information. In this paper, we show how the capability of the image guided surgery system to track surgical instruments can be harnessed for user interaction. Our method enables the user to define points and freely drawn shapes in 3-d and provides selectable menu items, which can be located in the immediate proximity of the patient. This eliminates the need for conventional touchscreen- or mouse-based user interaction without requiring additional dedicated hardware such as separate tracking systems or specialized 3-d input devices. Thus, the surgeon can interact with the system directly, without the help of additional personnel. We demonstrate our new input method with an application for creating operation plan sketches directly on the patient in an augmented view.
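The interaction scheme the abstract describes, in which the tracked instrument tip defines points, draws free 3-d strokes, and selects menu items by proximity, can be summarized in a short sketch. The following Python fragment is a minimal illustration of that mapping under stated assumptions, not the paper's implementation: the class name, the event tuples, and the 5 mm selection radius are invented for illustration, and the real system would read the instrument pose from the certified image guided surgery device rather than from replayed samples.

    import numpy as np

    # All names below are illustrative stand-ins: the actual API of the
    # image guided surgery (IGS) tracker is not described in the abstract.

    MENU_RADIUS = 5.0  # mm: a tip closer than this to a menu item selects it (assumed value)


    class InstrumentInteraction:
        """Turn tracked instrument tip positions into interaction events:
        single points, freely drawn 3-d strokes, and menu selections."""

        def __init__(self, menu_items):
            # menu_items: name -> 3-d position near the patient
            self.menu_items = {k: np.asarray(v, float)
                               for k, v in menu_items.items()}
            self.stroke = []

        def update(self, tip, button_down):
            """Feed one tracker sample; return a (kind, payload) event or None.

            tip         -- instrument tip (x, y, z) in patient coordinates
            button_down -- True while the user holds the instrument's switch
            """
            tip = np.asarray(tip, float)

            # Menu selection: the tip entering an item's radius selects it.
            for name, pos in self.menu_items.items():
                if np.linalg.norm(tip - pos) < MENU_RADIUS:
                    return ("select", name)

            if button_down:
                self.stroke.append(tip)        # accumulate the drawn shape
                return None
            if self.stroke:                    # switch released: emit result
                done, self.stroke = self.stroke, []
                if len(done) == 1:
                    return ("point", done[0])  # a single sample is a point
                return ("stroke", done)        # a run of samples is a sketch
            return None

A caller would poll the tracker in a loop and feed each sample to update(); here replayed_samples is a hypothetical stand-in for that loop:

    ui = InstrumentInteraction({"delete sketch": (120.0, 40.0, 10.0)})
    for tip, pressed in replayed_samples:  # hypothetical tracker samples
        event = ui.update(tip, pressed)
        if event:
            print(event)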
