**Method**
{{ :ar_goggle.png?240}}
As illustrated in Figure 1, the user wears optical see-through goggles (Juxtopia LLC, Baltimore, MD) and a helmet that supports a compact optical tracking system (MicronTracker, Claron Technology, Toronto, Canada). The first step is a registration procedure (Fig. 1), in which the surgeon uses a tracked probe to touch markers that were affixed to the patient prior to the preoperative imaging. A paired-point rigid registration technique computes the transformation that aligns the preoperative data (e.g., the tumor outline) to the real world.
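A paired-point rigid registration of this kind can be computed in closed form from the fiducial correspondences. The snippet below is a minimal Python/NumPy sketch of a generic SVD-based (Arun-style) solution, not the lab's actual implementation; the function and variable names are illustrative only.

<code python>
import numpy as np

def paired_point_rigid_registration(fixed, moving):
    """Closed-form rigid registration (Arun et al., SVD-based) -- illustrative sketch.

    fixed, moving : (N, 3) arrays of corresponding 3-D points
                    (e.g., markers digitized with the tracked probe, and the
                    same markers located in the preoperative image).
    Returns R (3x3) and t (3,) such that  fixed ~ moving @ R.T + t.
    """
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)

    # Centroids of both point sets.
    c_fixed, c_moving = fixed.mean(axis=0), moving.mean(axis=0)

    # Cross-covariance of the centered correspondences.
    H = (moving - c_moving).T @ (fixed - c_fixed)

    # Rotation from the SVD, guarding against a reflection.
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = c_fixed - R @ c_moving
    return R, t
</code>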
- | |||
- | {{ ::ar_goggle.png?240 |}} | ||
After registration, the surgeon can see the registered preoperative model overlaid on the real anatomy through the optical see-through goggles (Figure 2, which illustrates the concept with a skull model rather than the tumor margin that would be used in an actual surgery).
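Conceptually, the overlay chains two transforms: the registration above maps model points from preoperative image coordinates into the tracker frame, and a display calibration (not described on this page) maps them into the goggles' rendering frame. The sketch below outlines that chain; //T_display_from_tracker// is a hypothetical 4x4 calibration transform assumed to be known.

<code python>
import numpy as np

def overlay_model_points(model_pts, R_reg, t_reg, T_display_from_tracker):
    """Map preoperative model vertices into the goggles' display frame (sketch).

    model_pts              : (N, 3) vertices in preoperative image coordinates.
    R_reg, t_reg           : image-to-tracker registration (as computed above).
    T_display_from_tracker : hypothetical 4x4 homogeneous transform from the
                             tracker frame to the display frame (from a
                             separate display calibration).
    """
    # Preoperative image -> tracker coordinates.
    pts_tracker = model_pts @ R_reg.T + t_reg

    # Tracker -> display coordinates (homogeneous multiplication).
    pts_h = np.hstack([pts_tracker, np.ones((len(pts_tracker), 1))])
    return (T_display_from_tracker @ pts_h.T).T[:, :3]
</code>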
**Future**
  * Inertial Sensing (e.g., accelerometers, gyroscopes)
  * Kalman Filter for Sensor Fusion
  * Magnification
  * Eye Tracking
===== Publications =====
  - Ehsan Azimi, Jayfus Doswell, Peter Kazanzides, "Augmented Reality Goggles with an Integrated Tracking System for Navigation in Neurosurgery", IEEE Virtual Reality 2012, 4-8 March 2012.
  - Praneeth Sadda, Ehsan Azimi, George Jallo, Jayfus Doswell and Peter Kazanzides, "Surgical Navigation with a Head-Mounted Tracking System and Display", Studies in Health Technology and Informatics, Vol. 184, p. 363.