Towards Smart Endoscope: A Data Collection System for Sinus Surgeries

Last updated: May 11th, 6 am

Summary

This project aims to build a data collection system which will enable experiments and further development of Smart Endoscope Project.

  • Students: Ruiqing Yin
  • Mentor(s): Dr. Russell Taylor, Dr. Masaru Ishii, Dr. Chien-ming Huang

Background

Smart Endoscope Project

Functional Endoscopic Sinus Surgery (FESS) is one of the most commonly performed sinus surgeries. Surgeons must hold and manipulate the endoscope by hand throughout the procedure, which can last up to one and a half hours depending on its complexity. The Smart Endoscope project aims to develop a fully autonomous, robotic endoscope manipulator built on the Galen REMS.

Data Collection System

In the past 30 years, many efforts have been made to automate the manipulation of endoscopes, but none have yielded ideal clinical outcomes. The biggest challenge is making the manipulation as intuitive and as effortless as possible for surgeons, which requires predicting and controlling the endoscope based on other sensor information. Therefore, to thoroughly understand the hidden rules and signals behind endoscope manipulation, and guided by the literature and surgeon feedback, I built an intra-operative data collection system that captures the movement of the endoscope, the endoscope video, the movement of a suction tool, the movement of the patient's head, and the surgeon's gaze. By analyzing these data together with preoperative CT and statistical data, we hope to finally make the endoscopic manipulator 'smart'.

Deliverables

  • Minimum: (Expected by March 19th, 2018)
    1. Hardware for a functional data collection system. (DONE)
    2. Workflow for the experiment. (DONE)
  • Expected: (Expected by April 2nd, 2018)
    1. Hardware for a functional data collection system. (DONE)
    2. Workflow for the experiment. (DONE)
    3. Software for a functional data collection system. (DONE)
  • Maximum: (Expected by May 11th, 2018)
    1. Hardware for a functional data collection system. (DONE)
    2. Workflow for the experiment. (DONE)
    3. Software for a functional data collection system. (DONE)
    4. Software for post-processing the collected data. (In Progress)

Data Planned to Be Collected


Hardware Used in Project


Software Used in Project


Technical Approach

To provide enough data for later machine learning studies, several kinds of data are planned to be collected:

  1. Traces of the suction tool
  2. Traces of the endoscope
  3. Traces of the patient's head
  4. Position of fiducial points on patient's head
  5. The video stream of the endoscope
  6. Surgeon's gaze mapped onto the endoscopic video
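The fiducial points (item 4) are what relate the EM-tracked frames to the preoperative CT: given corresponding fiducial positions in both coordinate systems, a least-squares rigid transform can be estimated. A minimal sketch of such SVD-based point-set registration (the Arun/Kabsch method), assuming NumPy; the function name and data are illustrative, not part of the actual system:

```python
import numpy as np

def register_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding fiducial positions,
    e.g. fiducials in the EM tracker frame and in the CT frame.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)   # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the SVD solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

The determinant check keeps the result a proper rotation rather than a reflection, which matters when fiducials are nearly coplanar.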

An Aurora EM tracking system is used to obtain the position and pose data. The video is recorded by the default camera used during common endoscopic surgeries. A GazePoint gaze tracker is mounted on the video display in order to keep track of the surgeon's gaze.
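Before the surgeon's gaze can be overlaid on or logged alongside the endoscopic frames, the tracker's normalized display coordinates must be mapped onto pixels of the video. A small sketch of that mapping, assuming a GazePoint-style convention of normalized coordinates in [0, 1] with the origin at the top-left; the function name is illustrative:

```python
def gaze_to_pixel(gx, gy, width, height):
    """Map normalized gaze coordinates (0..1, top-left origin) to pixel
    coordinates of a width x height video frame, clamping off-screen gaze."""
    gx = min(max(gx, 0.0), 1.0)   # clamp gaze that falls off the display
    gy = min(max(gy, 0.0), 1.0)
    return int(round(gx * (width - 1))), int(round(gy * (height - 1)))
```

Clamping keeps momentary off-screen fixations from producing out-of-range pixel indices in the logged data.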

Both the EM sensor data and the video are logged using the rosbag package in ROS.
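Logging with rosbag can be driven directly from the command line; a sketch of what a recording invocation might look like (the topic names here are illustrative assumptions, not the actual topics published by this system):

```shell
# Record EM poses, endoscope video, and gaze into one bag file.
# Topic names below are placeholders for illustration only.
rosbag record -O sinus_case_01.bag \
    /aurora/sensor_poses \
    /endoscope/image_raw \
    /gazepoint/gaze
```

`-O` names the output bag; recording all streams into a single bag keeps the ROS timestamps in one place for later synchronization and playback.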


Dependencies

Table 1 shows the foreseen dependencies of this project as well as their solutions, alternative plans, and current status. In the Status column, green means solved, yellow means in progress, and red means overdue and being replanned.

Milestones and Status

 Feb 26th: Project proposal completion (Completed)

 Mar 9th: Tool adapter design completion (In Progress)
   - Suction tool manufacturing
   - Endoscope adapter manufacturing
   - Head adapter design

 Mar 19th: Hardware integration completion

 Apr 13th: Data logging software completion

 Apr 20th: First Test and Troubleshooting

 May 4th: Second Test and Troubleshooting

Reports and presentations

Reading List

[1] S. Horgan and D. Vanuno, "Robots in laparoscopic surgery," J. Laparoendosc. Adv. Surg. Techn. A, vol. 11, pp. 415-419, 2001.

[2] M. Hashizume, Fundamental Training for Safe Endoscopic Surgery. Fukuoka, Japan: Innovative Medical Technology, Graduate School of Medical Science, Kyushu University, 2005, p. 49 (in Japanese).

[3] Y. Cao, S. Miura, Y. Kobayashi, K. Kawamura, S. Sugano, and M. G. Fujie, "Pupil variation applied to the eye tracking control of an endoscopic manipulator," IEEE Robotics and Automation Letters, vol. 1, no. 1, pp. 531-538, Jan. 2016.

[4] Y. Cao, Y. Kobayashi, S. Miura, K. Kawamura, M. G. Fujie, and S. Sugano, "Pupil variation for use in zoom control," in Robotics

[5] J. Kaouk, G. Haber, R. Goel, M. Desai, M. Aron, R. Rackley, C. Moore, and I. Gill, "Single-Port Laparoscopic Surgery in Urology: Initial Experience," Urol., pp. 3-6, 2008.

[6] Y. Cao, Y. Kobayashi, B. Zhang, Q. Liu, S. Sugano, and M. G. Fujie, "Evaluating proficiency on a laparoscopic suturing task through pupil size," in Proc. IEEE Int. Conf. Syst., Man, Cybern. (SMC), Oct. 9-12, 2015, pp. 677-681.

Other Resources and Project Files

Other project files (e.g., source code) associated with the project are listed here. Files that are online are linked to an appropriate external repository or to uploaded media files under this namespace.

courses/456/2018/456-2018-01/project-01.txt · Last modified: 2018/05/11 06:13 by ryin6@johnshopkins.edu



