An Improved GUI and Vision-based Guidance for the Robo-ELF

Last updated: 3-30-12

Summary

The purpose of this project is to design and implement an improved user interface for the Robo-ELF: a new, robust GUI and vision-based point-and-click motion control. The current system is under review for FDA approval for clinical trials, and completing the remaining requirements for that approval is also a major project goal.

  • Students: Jon Kriss
  • Mentor(s): Kevin Olds, Russell Taylor, Dr. Jeremy Richmon

Background, Specific Aims, and Significance

The Robotic EndoLaryngeal Flexible Scope (Robo-ELF) solves the problem of visualization in the throat during endoscopic surgery. It provides a means to hold and position a flexible endoscope during surgery so the surgeon has both hands free to operate. A prototype robot was built and tested in 2011 using phantoms and human cadavers. It functioned well in testing and surgeons were pleased with the results. One complaint was that the control mechanism was unintuitive and difficult to use.

The ultimate goal of the project started in 2011 was to produce a clinically viable system and put it through human trials. The documentation and approval process to do this is almost complete. The largest remaining task is passing FDA requirements for a clinical system. Finishing these requirements is the first goal of this project.

A secondary goal of this project is to produce a more user-friendly control interface. Vision-based guidance using a point-and-click display is much more intuitive and easier to use than the current joystick interface. A more robust GUI will also be of use to the surgeons and is a natural progression for the system.

Deliverables

  • Minimum: Expected by April 20
    1. Complete requirements to send to FDA for approval of clinical trials
    2. Complete system design review
    3. Complete software design review
    4. User manual
    5. Documentation and FMEA
  • Expected: Expected by April 22
    1. Implement Vision-Based Navigation
    2. svlFilter for guidance code
    3. Integrate vision guidance with current control method
  • Maximum: Expected by May 4
    1. New robust surgical GUI
    2. Visual representation of robot on screen
    3. Provide quick access to pre-operative images

Technical Approach

To complete the requirements for clinical trials, several tasks must be accomplished. All of the safety features, especially the software safety features, must be tested and documented for validation. This includes the software-activated emergency stop, the heartbeat signal, encoder/potentiometer checking, joystick failure detection, and Galil controller error detection. Most of these features are already implemented, but they require more testing and documentation before being submitted for approval. Better handling of system errors must also be implemented. Errors will be split into two categories: serious and non-serious. Serious errors indicate a serious system failure and will require a full system restart to continue. Non-serious errors are recoverable without restarting the system and do not present a danger to the patient. Mechanical changes to the system must also be completed to allow for easier draping and disassembly. Renata Smith and Kevin Olds are responsible for completing these changes, as well as for designing a required draping system for the robot.
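The serious/non-serious split above can be sketched as a simple fault-severity table. This is an illustrative sketch only, not the Robo-ELF source; the fault names and the fail-safe default are hypothetical examples of how such a classification might look.

```python
# Illustrative sketch (not the Robo-ELF code): classifying system faults
# into the two error categories described above. Fault names are
# hypothetical examples.
from enum import Enum

class Severity(Enum):
    NON_SERIOUS = 1  # recoverable without restarting the system
    SERIOUS = 2      # indicates a serious failure; full restart required

# Example mapping; a real system would derive this table from its FMEA.
FAULT_SEVERITY = {
    "encoder_potentiometer_mismatch": Severity.SERIOUS,
    "galil_controller_error": Severity.SERIOUS,
    "heartbeat_timeout": Severity.SERIOUS,
    "joystick_disconnect": Severity.NON_SERIOUS,
}

def requires_restart(fault):
    """Unknown faults are treated as serious, failing safe."""
    return FAULT_SEVERITY.get(fault, Severity.SERIOUS) is Severity.SERIOUS
```

Defaulting unknown faults to serious reflects the fail-safe posture a clinical system would need, though the actual policy belongs to the FMEA.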

Once the changes and documentation are complete, we must perform an FMEA risk analysis of the system and validate that all identified risks have been properly mitigated. Part of this mitigation is a full software review, which will be completed first. These system reviews will be conducted with all team members and senior members of the LCSR faculty and staff.
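In standard FMEA practice, failure modes are ranked by a risk priority number (RPN): the product of severity, occurrence, and detectability ratings, each conventionally scored 1-10. The sketch below illustrates that convention; the failure modes and ratings shown are hypothetical examples, not entries from the project's actual FMEA.

```python
# Illustrative sketch of the standard FMEA risk priority number.
# Failure modes and ratings below are hypothetical examples.
def rpn(severity, occurrence, detection):
    """RPN = severity x occurrence x detectability, each rated 1-10."""
    for rating in (severity, occurrence, detection):
        assert 1 <= rating <= 10, "FMEA ratings are conventionally 1-10"
    return severity * occurrence * detection

failure_modes = [
    # (description, severity, occurrence, detection)
    ("scope collides with tissue", 8, 3, 4),
    ("joystick input dropout", 4, 4, 2),
]

# Highest-RPN failure modes are addressed first.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
```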

Vision-based navigation will be implemented using algorithms developed and tested in summer 2011 by visiting student Hongho Kim. The algorithm uses an approximate kinematic model of the flexible scope to estimate its orientation, and template matching to confirm that the scope is moving in the commanded direction. It was developed and tested using OpenCV. Our task is to implement the same algorithm as a CISST svlFilter and integrate the results into the current control code for the robot. The new GUI will be implemented in Qt with the CISST libraries. We will meet with the surgeons to discuss the exact design and layout of the GUI and which features they would find most useful.
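The core template-matching step can be sketched in a few lines. This is a minimal pure-Python illustration of the general technique (sliding a template over a frame and picking the offset with the lowest sum of squared differences), not the OpenCV or CISST svlFilter implementation used in the project; all names here are our own.

```python
# Illustrative sketch of template matching (sum-of-squared-differences),
# not the project's OpenCV/svlFilter code. Images are 2-D lists of
# grayscale intensities.
def match_template(image, template):
    """Return the (x, y) offset where the template best matches."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_ssd, best_pos = float("inf"), (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = sum((image[y + j][x + i] - template[j][i]) ** 2
                      for j in range(th) for i in range(tw))
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (x, y)
    return best_pos

# Example: the 2x2 patch matches this 4x4 "frame" at offset (1, 1).
frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 6, 0],
         [0, 0, 0, 0]]
patch = [[9, 8],
         [7, 6]]
offset = match_template(frame, patch)
```

Comparing the best-match offset against the commanded scope motion is the confirmation step the paragraph above describes; a real implementation would use OpenCV's optimized matching rather than this brute-force loop.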

Dependencies

describe dependencies and effect on milestones and deliverables if not met

Milestones and Status

  1. Milestone name: Complete FDA Requirements
    • Planned Date: 2/21
    • Expected Date: 4/20
    • Status: Software complete. Finishing documentation.
  2. Milestone name: Complete Vision-Based Navigation
    • Planned Date: 2/21
    • Expected Date: 4/22
    • Status: Delayed (not achievable this semester)
  3. Milestone name: Complete New Surgical GUI
    • Planned Date: 2/21
    • Expected Date: 5/4
    • Status: In development

Reports and presentations

Project Bibliography

  • Olds, K., Hillel, A. T., Cha, E., Curry, M., Akst, L. M., Taylor, R. H., and Richmon, J. D. (2011). Robotic endolaryngeal flexible (Robo-ELF) scope: A preclinical feasibility study. The Laryngoscope, 121: 2371–2374. Link
  • Reilink, R., Stramigioli, S., and Misra, S. (2010). Image-based flexible endoscope steering. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2339–2344, 18–22 Oct. 2010. Link
  • McLeod, I. K., Mair, E. A., and Melder, P. C. (2005). Potential applications of the da Vinci minimally invasive surgical robotic system in otolaryngology. Ear Nose Throat J, 84: 483–487. Link
  • Buckingham, R. O., and Graham, A. C. (1996). Computer controlled redundant endoscopy. IEEE International Conference on Systems, Man, and Cybernetics, vol. 3, pp. 1779–1783, 14–17 Oct. 1996. Link
  • Ott, L., Nageotte, F., Zanne, P., and de Mathelin, M. (2011). Robotic assistance to flexible endoscopy by physiological-motion tracking. IEEE Transactions on Robotics, 27(2): 346–359, April 2011. Link

Other Resources and Project Files

Current code is kept in the LCSR repository under apps (Throat Robot) and can be found here.
Our completed FMEA: pdf, excel
