Human Subjects Study, Ergonomic Controller and 3D Reconstruction for the Robo-ELF

Last updated: May 13, 2014

Summary

The Robotic Endo-Laryngeal Scope (Robo-ELF Scope) is a robotic system for the manipulation of unmodified clinical flexible endoscopes. With this project, we aim to improve the current state of the Robo-ELF. The goals of this project are to create a system capable of providing quantitative endoscopic measurements and 3D reconstruction from several monocular endoscopic images, to design an ergonomic controller for the robot, and to acquire clinical experimental results.

  • Students: Tae Soo Kim (tkim60@jhu.edu), Jong Heun Kim (jay9204@gmail.com), Steve Park (spark94@jhu.edu)
  • Mentor(s): Kevin Olds (kolds1@jhu.edu), Dr. Jeremy Richmon (jrichmo7@jhmi.edu)

Background, Specific Aims, and Significance

Background / Significance

Traditional methods of laryngeal surgery involved disruption of the laryngeal framework, which led to motor and sensory deficits in patients. Attempts to improve on, and possibly avoid, such externally invasive procedures led to advances in endoscopic laryngeal surgery, which accesses the endolarynx through the patient's mouth. Despite the enhancements to endolaryngeal surgery brought by improvements in microlaryngeal instrumentation and endoscopes, the current setup of endolaryngeal surgery continues to have disadvantages. The biggest shortcomings are the reduced depth perception from monoscopic images, the operator's distance from the surgical field, and the crowded workspace for surgeons.[1]

The Robotic Endo-Laryngeal Scope (Robo-ELF Scope) is a robotic system for the manipulation of unmodified clinical flexible endoscopes (Fig 1). It is designed to improve precision, coordination, ergonomics, and surgical capabilities when using flexible endoscopes in the operating room for visualization of the upper airway.[1]

Picture courtesy of Kevin Olds

The current state of the robot enables the surgeon to manipulate it using a joystick controller attached to the surgical bed while real-time endoscope images are displayed on a monitor in the operating room. An unmodified flexible laryngoscope is attached to the robot, and the surgeon can manipulate the scope with only one hand using the joystick.

Currently, in order to measure the degree of subglottic stenosis, the otolaryngologist inserts endotracheal tubes of increasing size into the stenotic segment to perform a leak test. The chosen endotracheal tube size is then used to rank the severity of the subglottic stenosis. Such an approach may itself alter the target it is trying to measure, and cylindrical endotracheal tubes are not well suited for sizing stenotic segments. Another noticeable drawback of the Robo-ELF is its user-unfriendly joystick controller.

Specific Aims

The goals of this project are:

  1. Create a system capable of providing quantitative endoscopic measurements
  2. Create software that enables 3D reconstruction from several monocular endoscopic images
  3. Design an ergonomic controller for the robot
  4. Acquire clinical experimental results

Deliverables

  • Minimum: (March 26, 2014)
    1. Assist Dr. Richmon in using the Robo-ELF Scope in the OR.
    2. A fully designed ergonomic controller for the Robo-ELF Scope manipulation.
    3. Documentation for the Robo-ELF Scope controller
    4. Software to get real measurements from 2D scope images of an artificial scene.
  • Expected: (April 30, 2014)
    1. Fully interfaced and functioning ergonomic controller with the Robo-ELF.
    2. Software to get real measurements from 2D scope images of the larynx (phantom).
    3. Software documentation
  • Maximum: (May 07, 2014)
    1. Identify the disadvantages with the current prototype (feedback from surgeons) and produce an improved version of the controller.
    2. Software that reconstructs a 3D model of the larynx from monoscopic 2D images

Technical Approach

Robo-ELF Human Subject Study

Upon Institutional Review Board (IRB) approval, the clinical trial of the Robo-ELF will begin. The study procedures involve several tasks to compare the performance of the traditional rigid scope and the flexible Robo-ELF.

With the patient’s airway visualized with the laryngoscope, the Robo-ELF flexible scope will be passed through the shaft into the upper airway. The surgeon will manipulate the system using a 3-D joystick controller to evaluate the field of vision. Pictures and videos will be taken of the scene as well as the limits of the visual field. Then, the procedure advances to visualize areas that are traditionally challenging to approach. The surgeon will manipulate the scope with one hand while controlling a laryngoscopic suction with the other. Further, the Robo-ELF will be fixed in a position to demonstrate the ability to perform two-handed procedures, such as above the vocal cords for a microlaryngeal surgery. Pictures and videos from the tasks will be evaluated by laryngologists to rate the quality of the images and the system's value as a novel method of endoscopic examination.

Ergonomic Controller

The new design of the joystick should have an intuitive interface that mimics the scope's degrees of freedom, self-reorientation, and gradation control with haptic feedback for more precise velocity manipulation. For intuitive control of the Robo-ELF, the controller should have three active degrees of freedom: advance/retract, rotate, and up/down. In addition, the controller implements a gradation sensor, such as a potentiometer, for more delicate manipulation of the scope's speed.

For haptic feedback and reorientation of each degree of freedom in the controller, a gimbal system is implemented.

The complete design assembly will be defined in detail with Computer-Aided Design (CAD) software such as AutoCAD. Most of the components in the controller design can be printed with a 3D printer. The remaining components, such as the motors, potentiometers, and metal bars in the gimbal system, will be purchased. After acquiring and assembling all the parts, we will test the functionality of the joystick with an Arduino microcontroller to verify that the controller produces the desired gradation output. Once the joystick's functionality is verified with the Arduino, the controller needs to communicate with the Robo-ELF via the Galil motor controller. The controller with the Galil motor system will then be tested by driving the Robo-ELF. Once the controller can move the Robo-ELF with all the desired functionality, we can evaluate its defects meticulously and fix them.
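As a rough illustration of the gradation behavior described above, the following Python sketch maps a raw 10-bit potentiometer reading to a signed scope velocity with a dead zone around center. The function name, dead-zone width, and center value are hypothetical; the real gradation curve would run on the Arduino/Galil hardware and be tuned on the robot.

```python
def pot_to_velocity(raw, max_speed=1.0, dead_zone=0.05):
    """Map a 10-bit potentiometer reading (0..1023) to a signed velocity.

    The pot's mechanical center (assumed at 512) corresponds to zero
    velocity; a small dead zone around center suppresses drift from
    sensor noise. All constants here are illustrative assumptions.
    """
    # Normalize the reading to [-1, 1] around the center position.
    x = (raw - 512) / 512.0
    # Suppress small deflections inside the dead zone.
    if abs(x) < dead_zone:
        return 0.0
    # Rescale the remaining range so velocity ramps smoothly from zero
    # at the dead-zone edge up to max_speed at full deflection.
    sign = 1.0 if x > 0 else -1.0
    return sign * max_speed * (abs(x) - dead_zone) / (1.0 - dead_zone)

print(pot_to_velocity(512))  # -> 0.0 (centered stick, no motion)
print(pot_to_velocity(0))    # -> -1.0 (full deflection one way)
```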

3D distance from 2D image

The vision component of the Robo-ELF project will yield software that enables the user to measure 3D distances from 2D points on an endoscopic image. Producing such measurements requires the following information: the focal length of the camera, the stereo baseline, and the disparity between stereo images. The block diagram below represents the means of obtaining the relevant data and their relation to one another.

The process specified above completes the minimum vision requirement of the project. Given a point P1 in one of the stereo images (assume the left image, L), the software must find the correct corresponding point P1’ in the right image R. There are multiple suitable methods for identifying correspondences between stereo images. A leading idea is to compute the SIFT (scale-invariant feature transform) descriptor for a specified window size around P1 in the first image and to find the corresponding point in the second image using that descriptor. The correspondence is found by picking the matching point P1’ in the right image R that minimizes the cost function (in this case, the sum of squared differences between the SIFT descriptor values). The same procedure is applied to P2.
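The SSD cost minimization over descriptors can be sketched in a few lines of NumPy. The descriptors below are tiny stand-in arrays; in practice they would be 128-dimensional SIFT descriptors from an implementation such as OpenCV's.

```python
import numpy as np

def best_match(desc_p1, right_descs):
    """Return the index of the right-image descriptor that minimizes the
    sum of squared differences (SSD) against the left-image descriptor.

    desc_p1:     descriptor of P1 in the left image, shape (D,)
    right_descs: candidate descriptors in the right image, shape (N, D)
    """
    ssd = np.sum((right_descs - desc_p1) ** 2, axis=1)
    return int(np.argmin(ssd))

# Toy example with 4-dimensional stand-in descriptors.
left = np.array([1.0, 0.0, 2.0, 1.0])
right = np.array([[5.0, 5.0, 5.0, 5.0],
                  [1.1, 0.0, 2.0, 0.9],   # nearly identical descriptor
                  [0.0, 3.0, 0.0, 0.0]])
print(best_match(left, right))  # -> 1
```

A real pipeline would also apply a ratio test or epipolar constraint to reject ambiguous matches, but the core cost minimization is exactly this argmin.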

With the corresponding point pair (P1, P1’), the algorithm can calculate the disparity between them. Disparity is the pixel distance between the corresponding points in the two images. Having found all the necessary parameters (focal length f, baseline B, disparity d), the algorithm can not only find the depth Z of a given point P1 in the real world but also its exact 3D coordinates. The calculation involves simple algebra with similar triangles, as shown below.
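Concretely, the similar-triangle relation gives Z = f*B/d, and the full 3D coordinates follow by back-projecting the pixel through the pinhole model. A minimal sketch, assuming pixel coordinates measured from the image center and consistent units (focal length and disparity in pixels, baseline in mm):

```python
def point_from_disparity(u, v, d, f, B):
    """Recover the 3D coordinates (X, Y, Z) of a pixel from stereo geometry.

    u, v: pixel coordinates relative to the image center (pixels)
    d:    disparity between the corresponding points (pixels)
    f:    focal length (pixels)
    B:    stereo baseline (same unit as the returned X, Y, Z, e.g. mm)
    """
    Z = f * B / d   # depth from similar triangles
    X = u * Z / f   # back-project the horizontal pixel offset
    Y = v * Z / f   # back-project the vertical pixel offset
    return X, Y, Z

# Example: f = 500 px, baseline 5 mm, disparity 25 px -> depth 100 mm.
print(point_from_disparity(50, 0, 25, 500, 5))  # -> (10.0, 0.0, 100.0)
```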

Figure courtesy of Professor Mubarak Shah of University of Central Florida

3D reconstruction

Having solved for the real-world coordinates of P1 and P2, a simple distance calculation between the two points yields a real 3D distance. The natural next step of the project is to build on the stereo 3D distance calculation and provide a 3D reconstruction algorithm from 2D endoscope images. This extends the problem of finding the depth of a single point to all points in the image: using the technique described above, the depth of every point in the image is calculated, and as a result the software can produce a 3D reconstructed model of the airway. Such an algorithm is intuitive, and it benefits from the algorithm already developed for the distance calculation. However, a possible shortcoming of the proposed 3D reconstruction technique is that it depends heavily on finding correct correspondences. A possible solution is to use the baseline information from the robot calibration and the symmetry of the stereo images. Knowing the baseline, the correspondence problem between the stereo images may have to be solved only once. This approach to 3D reconstruction from 2D images still needs to be investigated further.
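Once a dense disparity map is available, extending the per-point depth computation to every pixel is a vectorized operation. A hedged NumPy sketch (a real pipeline would also need rectified images and a robust disparity estimator; zero-disparity pixels have no recoverable depth):

```python
import numpy as np

def depth_map(disparity, f, B, eps=1e-6):
    """Convert a dense disparity map (pixels) to a depth map via Z = f*B/d.

    Pixels with (near-)zero disparity correspond to points at infinity or
    failed matches; they are marked inf rather than dividing by zero.
    """
    d = np.asarray(disparity, dtype=float)
    Z = np.full_like(d, np.inf)
    valid = d > eps
    Z[valid] = f * B / d[valid]
    return Z

# Toy 2x2 disparity map with f = 500 px, B = 5 mm.
print(depth_map([[25.0, 50.0], [0.0, 10.0]], 500, 5))
# Depths: 100 mm and 50 mm on the top row; the zero-disparity
# pixel maps to inf, and the 10 px disparity gives 250 mm.
```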

Dependencies

  • Access to the Robo-ELF.
  • Images from the scope.
  • Software for 3D reconstruction.
    • OpenCV, Matlab, CISST library, C/C++
  • Cost.
    • Up to $1,000 will be provided from the project funding if necessary.
    • The majority of the ergonomic controller's parts will be 3D printed; other mechanical parts will either be provided by the laboratory or can be purchased at fairly low cost.
    • The reconstruction project is purely software, so many of its tasks will not require spending money.
  • JHU IRB approval (approved Feb. 24).
  • Medical consult & OR visit.
  • Human Subject Study
    • Compatibility issue of the scope at the lab and the scope in use currently at the hospital.

Milestones and Status

  1. Scope image data collection
    • Planned Date: Feb. 12
    • Expected Date: Feb. 12
    • Status: Completed
  2. Camera calibration
    • Planned Date: Feb. 26
    • Expected Date: Feb. 12
    • Status: Completed
  3. Robot calibration
    • Planned Date: Feb. 26
    • Expected Date: Feb. 20
    • Status: Completed
  4. Initial CAD design of the controller
    • Planned Date: Feb. 26
    • Expected Date: Feb. 24
    • Status: Completed
  5. Order and acquire all controller parts
    • Planned Date: Mar. 12
    • Expected Date: Mar. 26
    • Status: Completed
  6. Build prototype
    • Planned Date: Mar. 19
    • Expected Date: Apr. 30
    • Status: Completed
  7. Interfacing with the robot
    • Planned Date: Mar. 26
    • Expected Date: TBD
    • Status: Deferred to a future project
  8. 3D distance from 2D scope images of an artificial scene
    • Planned Date: Mar. 26
    • Expected Date: Mar. 12
    • Status: Completed
  9. 3D distance from 2D scope images of the larynx
    • Planned Date: Apr. 09
    • Expected Date: Apr. 30
    • Status: Completed
  10. Get input from Dr. Richmon about the controller
    • Planned Date: Apr. 09
    • Expected Date: TBD
    • Status: Deferred to a future project
  11. 3D reconstruction of an artificial scene using the scope images
    • Planned Date: Apr. 30
    • Expected Date: End of school year of 2015
    • Status: To be implemented
  12. 3D reconstruction of the larynx
    • Planned Date: May. 07
    • Expected Date: End of school year of 2015
    • Status: To be implemented

Reports and presentations

Project Bibliography

  1. Kevin Olds, Alexander T. Hillel, Elizabeth Cha, Martin Curry, Lee M. Akst, Russell H. Taylor, Jeremy D. Richmon. Robotic Endolaryngeal Flexible (Robo-ELF) Scope: A Preclinical Feasibility Study. The Laryngoscope; 2011; 121; 2371- 2374
  2. Darius Burschka and Gregory D. Hager. V-GPS – Image-Based Control for 3D Guidance Systems. In Proc. of IROS, pages 1789–1795, October 2003.
  3. Darius Burschka and Gregory D. Hager. V-GPS(SLAM): – Vision-Based Inertial System for Mobile Robots. In Proc. of ICRA, April 2004.
  4. Olof Enqvist, Fredrik Kahl, Carl Olsson. 2011. Non-Sequential Structure from Motion. Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference. p. 264-271.
  5. Daniel Mirota, Russell H. Taylor, Masaru Ishii, and Gregory D. Hager. Direct endoscopic video registration for sinus surgery. In Medical Imaging 2009: Visualization, Image-guided Procedures and Modeling. Proceedings of the SPIE, volume 7261, pages 72612K-1 - 72612K-8, February 2009.
  6. Toan D. Pham, Rucker Ashmore, Lisa Rotelli. 2011. Proportional joystick with integral switch. U.S. Patent 7931101, filed October 13, 2006, and issued April 26, 2011.
  7. Hanzi Wang, Daniel Mirota, and Gregory D. Hager. A generalized kernel consensus based robust estimator. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(1):178-184, 2010.
  8. William M. Wells III, Paul Viola, Hideki Atsumi, Shin Nakajima, Ron Kikinis. 1996. Multi-modal volume registration by maximization of mutual information. Medical Image Analysis. v. 1. p 35–51.

Other Resources and Project Files

  1. Matlab source library for camera calibration and stereo disparity computation. toolbox_calib.zip
  2. Matlab source library for image rectification. rectifkite.zip
  3. The computer vision component, comprehensive source. toolbox_calib.zip
