Robot Assisted Laparoscopic Ultrasound Elastography (RALUSE)

Project Goals

The goal of this project is to facilitate robot-assisted minimally invasive ultrasound elastography (USE) for surgeries involving the da Vinci surgical robot. Information from USE images will extend the range of applicable procedures and increase the robot's capabilities. The da Vinci system brings many surgical advantages, including minimal invasiveness, precise and dexterous motion control, and an excellent visual field. These advantages come at the cost of surrendering the surgeon's sense of feel and touch. Haptic information is significant for procedures such as organ-sparing resections that target removal of hidden tumors. In open surgery, a surgeon may locate an embedded tumor by manual palpation of tissue. Without haptic sensory information, robotic surgery requires other means to locate such tumors. USE imaging is a promising method to alleviate the impact of haptic sensory loss, because it recovers the stiffness properties of imaged tissue. USE peers deep into tissue layers, providing localized stiffness information superior to that of human touch.

Approach

da Vinci robot slave arm controlling the ultrasound probe

Generating an elastography image requires applying varying levels of strain to underlying tissue at a periodic rate. Tissue strain is generated by palpating an ultrasound probe against the tissue surface. During palpation, the ultrasound probe images the tissue under different amounts of strain; from two of these images a displacement map is generated, which can be used to determine the relative stiffness of underlying tissue. A high-quality elastogram requires careful control of the magnitude and orientation of the applied palpation. A uniform palpation providing consistent image quality is difficult to manage manually, and becomes even more difficult in a minimally invasive interventional setting such as the da Vinci robot. However, because the robot is aware of its position and is capable of applying very precise motion control over the ultrasound probe, there is no need for the user to control the palpation manually.
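The displacement-map step described above can be sketched in a few lines. The following is a minimal illustration, not the RALUSE implementation: it estimates axial displacement between a pre- and post-compression signal using windowed normalized cross-correlation, the standard building block for strain imaging, and recovers local strain as the spatial gradient of that displacement. All names, window sizes, and signal lengths here are illustrative assumptions.

```python
import numpy as np

def axial_displacement(pre, post, win=32, search=8):
    """Estimate axial displacement (in samples) between pre- and
    post-compression 1-D signals using windowed normalized
    cross-correlation. Window and search sizes are illustrative."""
    disp = []
    for start in range(0, len(pre) - win - search, win):
        ref = pre[start:start + win]
        ref_z = ref - ref.mean()
        best_lag, best_cc = 0, -np.inf
        for lag in range(0, search + 1):
            seg = post[start + lag:start + lag + win]
            seg_z = seg - seg.mean()
            norm = np.linalg.norm(ref_z) * np.linalg.norm(seg_z)
            cc = np.dot(ref_z, seg_z) / norm if norm > 0 else 0.0
            if cc > best_cc:
                best_cc, best_lag = cc, lag
        disp.append(best_lag)
    return np.array(disp)

def local_strain(disp, win=32):
    """Strain is the spatial derivative of the displacement map;
    stiff tissue shows low strain, soft tissue high strain."""
    return np.gradient(disp.astype(float)) / win

# Demo: the post-compression frame is the pre frame delayed by 3 samples.
pre = np.random.default_rng(0).standard_normal(2048)
post = np.concatenate([np.zeros(3), pre])[:2048]
d = axial_displacement(pre, post)  # every window reports a 3-sample shift
```

In a real elastography pipeline this runs per A-line over 2-D RF frames with subsample interpolation, but the structure — correlate windows, pick the peak lag, differentiate — is the same.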

The approach taken by this work is to use the robot's motion control capability to automatically generate the periodic palpation of the ultrasound probe required for producing an elastography image stream. As shown in the system overview figure, the automated palpation trajectory is overlaid onto motion inputs from the user, which allows the user to maintain high-level control of the ultrasound probe and the orientation of imaging while being freed from the cognitive burden of continuously applying ultrasound palpation. This feature not only reduces the burden on the user but also improves elastography image quality, since the da Vinci system is far more capable of producing a precise and continuously uniform palpation trajectory than a human user.
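The overlay idea above amounts to superimposing a small periodic offset on the user's commanded probe pose. The sketch below shows this for the probe's axial depth only; the amplitude and frequency values, and the function names, are illustrative assumptions rather than the system's actual parameters.

```python
import math

def palpation_offset(t, amplitude_mm=2.0, freq_hz=1.0):
    """Periodic axial offset generated by the robot.
    Amplitude and frequency here are illustrative, not RALUSE's values."""
    return amplitude_mm * math.sin(2.0 * math.pi * freq_hz * t)

def commanded_depth(user_depth_mm, t):
    """Overlay the automated palpation onto the user's depth command:
    the user retains high-level control of probe placement while the
    periodic compression needed for elastography is applied for them."""
    return user_depth_mm + palpation_offset(t)

# At t = 0.25 s the sinusoid is at its peak, so a 10 mm user command
# becomes a 12 mm commanded depth.
depth = commanded_depth(10.0, 0.25)
```

Because the offset is generated from the robot's own clock and kinematics, each compression cycle is uniform in magnitude and timing, which is exactly what a human operator struggles to sustain.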

RALUSE System Overview

The ability to automatically command motion inputs to the da Vinci robot is supported via the Read-Write da Vinci Research API provided to Johns Hopkins University through a research partnership with Intuitive Surgical, Inc. It is important to note that the capabilities of this API are reserved for research systems only; the API is neither approved for nor compatible with clinical da Vinci systems.

Control over the entire system is integrated through the open-source Surgical Assistant Workstation (SAW) software framework. Although open-source, SAW's design allows compatibility with proprietary software modules through module wrappers and well-defined interface protocols. Using this feature, the Read-Write da Vinci Research API has been integrated with the SAW-based framework for this application.

Both the ultrasound and elastography image feeds are displayed as picture-in-picture overlays in the video feed of the da Vinci console. This allows the user to observe real-time imaging while exercising control over the robot. The 3D User Interface capability of SAW lets the surgeon toggle the image overlays on and off and manipulate the size and location of the images in the field of view using the master controllers, in what we call “masters-as-mice mode,” as shown in the image below.

CISST 3D User Interface

Results

The capability of the system to generate accurate elastography images was tested using a Model 049 Elasticity QA Phantom from CIRS. This phantom contains simulated lesions of varying stiffness. The following figure shows a screenshot of the video feed shown to the user sitting at the da Vinci console while imaging two lesions of different stiffness simultaneously. As shown in the elastography image, the hard lesion appears darker, having much greater contrast with the background tissue than the soft lesion. Both lesions are also visible in the matching ultrasound image, where they appear very similar. Thus, the elastography image provides additional information concerning tissue stiffness that the ultrasound image cannot provide.

Elastography phantom being imaged by the RALUSE system

Project Personnel

JHU, Whiting School of Engineering

Faculty

  • Russell H. Taylor
  • Emad Boctor

Research Staff

  • Anton Deguet
  • Balazs Vagvolgyi

Graduate Students

  • Seth Billings
  • Nishikant Deshmukh
  • Hyun Jae Kang

Funding

  • NIH Graduate Partnership Program
  • JHU Internal Funds

Affiliated labs

Publications

* Seth Billings, Nishikant Deshmukh, Hyun Jae Kang, Russell H. Taylor, Emad Boctor, “System for Robot-Assisted Real-Time Laparoscopic Ultrasound Elastography”, SPIE Medical Imaging 2012.

research.robot_assisted_laparoscopic_ultrasound_elastography.txt · Last modified: 2019/08/07 16:01 by 127.0.0.1
