

Data Integration during Robotic Ultrasound-guided Surgery


Summary

Our project aims to improve the existing interface for the da Vinci robot console and to make it ready for clinical use in ultrasound-guided surgery. Surgical collaborators will be recruited for clinical testing to determine how effectively the interface minimizes the gap in the success of liver surgeries between novice and expert surgeons. Further improvements will be incorporated into the interface to allow the user to manipulate 3D lesions and fiducials within the da Vinci console itself.

  • Students: Vineeta Khatuja, Andrew Wang, Tifany Yung
  • Mentor(s): Michael Choti MD MBA, Theodoros Katsichtis, Colin Lea, Russell Taylor PhD

Background, Specific Aims, and Significance

Computer-assisted laparoscopic procedures are becoming increasingly common. Laparoscopy allows minimally invasive procedures in the abdominal cavity through a few small incisions rather than the single large incision of open surgery. These procedures can be augmented with ultrasonography, which provides real-time imaging to update the preoperative data. Laparoscopic ultrasound can greatly improve the efficacy of many operations, including increasing the accuracy of biopsies and ablations and aiding the scanning and detection of lesions on the liver, kidneys, and other organs.

A robotic platform offers the surgeon an array of advantages compared to conventional open and laparoscopic techniques. Stereoscopic 3D vision, seven degrees of freedom of instrument motion, and physiologic tremor attenuation allow the surgeon to operate with greater precision and dexterity.

Real-time ultrasonography is commonly used in modern open, laparoscopic, and robotic surgery. Intraoperatively, ultrasound data guide the surgical decision-making process when localizing a lesion and determining its position relative to other structures.

We propose the development and use of a novel da Vinci interface that integrates and displays live intraoperative US along with additional preoperative data (CT, MRI, X-ray, and other DICOM images) to the surgeon. Displaying all the necessary data within the da Vinci master console can potentially improve the ergonomics and efficacy of robotic procedures and thereby improve patient outcomes.

Current System

  1. Inputs
    • There are three inputs to the surgical workstation.
      1. Camera 1
      2. Camera 2
      3. Live Ultrasound
  2. Surgical Workstation
    • It processes the input data and performs the necessary calculations.
  3. Surgical Console
    • This is the console seen by the surgeon while performing the surgical procedure.
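
As a rough illustration of this data flow (not the actual Qt/CISST implementation), the sketch below opens the three video inputs and grabs one frame from each for processing. The device indices and function names are hypothetical, and OpenCV stands in for the frame grabbers used on the real workstation.

  import cv2

  # Assumed device indices for the three inputs; the real workstation receives
  # the camera and ultrasound feeds through dedicated frame grabbers.
  SOURCES = {"camera1": 0, "camera2": 1, "ultrasound": 2}

  def open_sources(sources):
      """Open every video input and fail loudly if one is missing."""
      captures = {}
      for name, index in sources.items():
          cap = cv2.VideoCapture(index)
          if not cap.isOpened():
              raise RuntimeError(f"Could not open input '{name}' (device {index})")
          captures[name] = cap
      return captures

  def grab_frames(captures):
      """Read one frame per input; the workstation would process these frames
      before forwarding them to the surgical console."""
      frames = {}
      for name, cap in captures.items():
          ok, frame = cap.read()
          if ok:
              frames[name] = frame
      return frames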

Tile Pro

TilePro is a software feature of the da Vinci that partitions the console display seen by the surgeon into separate viewing slots.

Our implementation of TilePro uses the two slots available in the da Vinci console:

  1. Upper slot: Displays the 3D live video of the operative field using the Camera 1 and Camera 2 inputs.
  2. Lower slot: Displays the live ultrasound video, the image browser, and other related items.
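
As a rough sketch of how the lower-slot image could be assembled (illustrative only; the actual interface is built with Qt and the CISST libraries, and the slot resolution, layout, and function names below are assumptions), the snippet places the live US frame next to a thumbnail column of previously saved images:

  import cv2
  import numpy as np

  SLOT_W, SLOT_H = 1024, 768   # assumed resolution of a TilePro slot
  THUMB_W = 200                # assumed width of the image-browser column

  def compose_lower_slot(us_frame, saved_images):
      """Return a SLOT_H x SLOT_W BGR image: live ultrasound on the left,
      a simple browser column of saved US images on the right."""
      # Live ultrasound occupies most of the slot.
      live = cv2.resize(us_frame, (SLOT_W - THUMB_W, SLOT_H))

      # Saved US images are stacked vertically to form the browser column.
      if saved_images:
          n = len(saved_images)
          thumbs = [cv2.resize(img, (THUMB_W, SLOT_H // n)) for img in saved_images]
          browser = cv2.resize(cv2.vconcat(thumbs), (THUMB_W, SLOT_H))
      else:
          browser = np.zeros((SLOT_H, THUMB_W, 3), dtype=np.uint8)

      return cv2.hconcat([live, browser])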

Deliverables

  • Minimum:
    1. Acquire software dependencies and Mock OR access. - Complete.
    2. Propose budget and acquire funding for da Vinci usage ($50/hr) from Dr. Taylor. - Complete.
    3. Preoperative image viewer: implement saving and browsing of US images on the console. - Complete.
    4. Preliminary nonclinical study design in mock OR. - Complete.
    5. Masters as Mice implementation. - Complete.
  • Expected:
    1. Clinical study design. - Complete.
    2. Submit the IRB proposal and obtain approval. - Complete.
    3. Contact surgical collaborators for clinical study. - In progress.
    4. Schedule and confirm participating surgeons for study. - In progress.
    5. Perform test and deployment of software in the mock OR. - Complete.
    6. Changes to software after nonclinical testing. - In progress.
  • Maximum:
    1. Complete clinical study of software at the hospital with actual patients. - Not expected.
    2. Features/changes requested from clinical tests. - Not expected.
    3. Tool to measure lesion size on the ultrasound image. - Not expected.
    4. Use 3D fiducials to show previously viewed areas. - Not expected.
    5. Build 3D model of anatomical structures from CT. - Not expected.
    6. Provide flexible view manipulation of 3D models. - Not expected.
  • Possible follow-ups:
    1. Speech-to-text intraoperative notes.
    2. 3D representation of the US probe in the UI.
    3. Enable 3D fiducial placement on the 3D models, not just the US images.

Technical Approach

  • Software:
    1. Remove the 3D lesion mapping tool.
      • Understand the LapUS code and identify the code to be removed.
    2. Real time operative tool measurement.
      • Use the da Vinci API to obtain real-time measurements of the tools in the operative field.
    3. Lesion measurement tool.
      • Perform calibration between the US images and the user interface (a minimal measurement sketch appears at the end of this section).
    4. User-friendly interface.
      • Less disruptive color scheme.
      • Save and browse capabilities with the US images, with lesion descriptions.
  • Non-Clinical Testing:
    • Once the baseline interface is complete (3D lesion mapping removed, color scheme and interface component arrangement improved), our team members Andrew and Tifany will perform non-clinical tests on the interface. These tests aim to assess the stability and compatibility of the baseline interface on the Hopkins da Vinci system. The tests will be performed with the da Vinci robot in the Hackerman Mock OR on a gelatin phantom liver with pseudo-lesions, using the LapUS interface to identify and locate lesions. Since the interface has not previously been tested for more than about 10 minutes at a time, our goal is to use it for a more extended period to check for crashes and to discover any glitches or bugs.
    • Once the enhanced interface is complete (3D organ model manipulation, 3D lesion mapping properly implemented, etc.), another stability and functionality test will be performed to uncover potential crashes or glitches.
  • Clinical Testing:
    • A clinical study will then be performed to test the utility and functionality of these added features. Since we plan to have surgeons test the interface during procedures on actual patients, we will require IRB approval. A questionnaire will be issued to the surgeons participating in the study to determine their satisfaction and comfort with the interface. A supplementary questionnaire will also be issued to determine the surgeons' opinions of the LapUS interface relative to using only the da Vinci live camera feed.
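
For the lesion measurement tool listed under the software items above, the core computation is converting a pixel distance marked on the US image into millimeters. The sketch below assumes a simple uniform pixel-to-millimeter calibration (for example, from two tick marks on the ultrasound depth ruler); the function names and numbers are illustrative, and the real tool would also have to account for the scanner's depth/zoom settings and the console overlay scaling.

  import math

  def calibrate_mm_per_pixel(known_distance_mm, p1, p2):
      """Derive the scale from two image points a known physical distance apart,
      e.g. two tick marks on the ultrasound depth ruler (assumed calibration)."""
      pixel_dist = math.dist(p1, p2)
      if pixel_dist == 0:
          raise ValueError("Calibration points must be distinct")
      return known_distance_mm / pixel_dist

  def lesion_size_mm(p1, p2, mm_per_pixel):
      """Distance in millimeters between two points clicked on the US image."""
      return math.dist(p1, p2) * mm_per_pixel

  # Illustrative numbers: ruler ticks 10 mm apart appear 52 px apart, and the
  # surgeon clicks the two ends of a lesion on the US image.
  scale = calibrate_mm_per_pixel(10.0, (100, 400), (100, 452))
  print(round(lesion_size_mm((310, 220), (395, 260), scale), 1), "mm")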

Dependencies

  1. Windows computer compatible with software dependencies.
    • Required to install software dependencies and begin work on the interface.
    • Status: resolved.
  2. Qt Creator IDE (IDE for the Qt application framework used for GUI development)
  3. CISST (libraries for development of computer-assisted intervention systems)
  4. Intuitive API (source code for control of the da Vinci surgical robot)
    • Requires signed non-disclosure agreement between Intuitive Surgical and each group member.
    • Status: resolved
  5. Video grabber
    • Install onto laptop to allow video connection with da Vinci console.
    • Status: resolved.
  6. Budget proposal for project funds to use Mock OR.
    • Mock OR costs $50/hr to use.
    • Status: resolved.
  7. IRB proposal submission.
    • IRB proposal is needed from the Homewood IRB to conduct clinical studies.
    • Status: resolved.
  8. Liver phantom
    • Gelatin phantom with pseudolesions recommended by Dr. Choti.
    • Cellulose and agarose/glycerol lesion imitations embedded in collagen phantom.
    • Status: resolved.
  9. Surgical/clinical collaborators
    • Needed as subjects and to assist with clinical testing of the new integrated interface.
    • Status: in progress.
  10. Mock OR and da Vinci robot access
    • Permission must be granted by Dr. Taylor to each member.
    • Status: resolved.

Milestones and Status

  1. Obtain Mock OR access.
    • Planned Date: 2/22/13
    • Expected Date: 2/21/13
    • Status: Complete.
  2. Set up software dependencies for GUI development.
    • Planned Date: 2/22/13
    • Expected Date: 3/29/13
    • Status: Complete.
  3. US images save and browse on console.
    • Planned Date: 3/8/13
    • Expected Date: 3/29/13
    • Status: Complete.
  4. Testing/debugging plan.
    • Planned Date: 3/1/13
    • Expected Date: not expected by end of semester.
    • Status: In progress.
  5. Contact surgical collaborators for clinical study
    • Planned Date: 3/1/13
    • Expected Date: not expected by end of semester.
    • Status: In progress.
  6. Have participating surgeons scheduled and confirmed.
    • Planned Date: 3/8/13
    • Expected Date: not expected by end of semester.
    • Status: In progress.
  7. IRB for clinical study
    • Planned Date: 4/1/13
    • Expected Date: 4/30/13
    • Status: Complete.
  8. Complete baseline interface testing in the Mock OR.
    • Planned Date: 3/22/13
    • Expected Date: 4/25/13
    • Status: Complete.
  9. Masters as Mice.
    • Planned Date: 4/5/13
    • Expected Date: 4/25/13
    • Status: Complete.
  10. Ability to manipulate a 3D model of a lesion or organ
    • Planned Date: 4/19/13
    • Expected Date: not expected.
    • Status: Not started.
  11. Use 3D fiducials to show previously viewed areas
    • Planned Date: 5/3/13
    • Expected Date: not expected.
    • Status: Not started.
  12. Clinical study design (task experiments, user feedback survey).
    • Planned Date: 4/19/13
    • Expected Date: 2/22/13
    • Status: Complete.
  13. Complete clinical testing of the software at the hospital.
    • Planned Date: 5/3/13
    • Expected Date: not expected by end of semester.
    • Status: Not started.

Reports and presentations

Project Bibliography

  • Bartosz F. Kaczmarek, S. S., Firas Petros, Quoc-Dien Trinh, Navneet Mander, Roger Chen, Mani Menon, Craig G. Rogers (2012). “Robotic ultrasound probe for tumor identification in robotic partial nephrectomy: Initial series and outcomes.” International Journal of Urology.
  • Caitlin M. Schneider, B. P. D. P., MD; Russell H. Taylor, PhD; Gregory W. Dachs II, MS; Christopher J. Hasser, PhD; Simon P. DiMaio, PhD; Michael A. Choti, MD, MBA, FACS Surgical Technique: Robot-assisted laparoscopic ultrasonography for hepatic surgery.
  • Caitlin M. Schneider, G. W. D. I., Christopher J. Hasser, Michael A. Choti, Simon P. DiMaio, Russell H. Taylor Robot-Assisted Laparoscopic Ultrasound, Johns Hopkins University; Johns Hopkins Medicine; Intuitive Surgical, Inc.
  • Craig G. Rogers, M. R. L., MD; Akshay Bhandari, MD; Louis Spencer Krane, MD; Daniel Eun, MD; Manish N. Patel, MD; Ronald Boris, MD; Alok Shrivastava, MD; Mani Menon, MD (2009). “Maximizing Console Surgeon Independence during Robot-Assisted Renal Surgery by Using the Fourth Arm and TilePro.” Journal of Endourology 23(1): 115-121.
  • Francesco Volonté, N. C. B., François Pugin, Joël Spaltenstein, Boris Schiltz, Minoa Jung, Monika Hagen, Osman Ratib, Philippe Morel (2012). “Augmented reality to the rescue of the minimally invasive surgeon. The usefulness of the interposition of stereoscopic images in the Da Vinci robotic console.” The International Journal of Medical Robotics and Computer Assisted Surgery.
  • Joshua Leven, D. B., Rajesh Kumar, Gary Zhang, Steve Blumenkranz, Xiangtian (Donald) Dai, Mike Awad, Gregory D. Hager, Mike Marohn, Mike Choti, Chris Hasser, Russell H. Taylor DaVinci Canvas: A Telerobotic Surgical System with Integrated, Robot-Assisted, Laparoscopic Ultrasound Capability, The Johns Hopkins University; Intuitive Surgical, Inc.

Other Resources and Project Files
