3D Tool Tracking in the Presence of Microscope Motion

Last updated: May 18th, 2017

Summary

The goal of this project is to accurately track 3D tool motion from stereo microscope video. This tool motion data will be used to analyze tool tremor in manual and robot-assisted surgery.

  • Students: Molly O'Brien
  • Mentor(s): Dr. Austin Reiter, Dr. Russell Taylor

Background, Specific Aims, and Significance

In microsurgery, surgeons need to get within a millimeter of critical structures. To make these procedures safer, we want to introduce a surgical robot that can enforce safety barriers. However, the patient may move during surgery. To accurately enforce safety barriers around patient structures, we need to track the robot motion with respect to the patient with submillimeter accuracy. Existing tracking systems are insufficient because they reliably achieve only 1.5-3 mm accuracy [1]. To achieve better tracking accuracy we want to:

  1. Track the camera motion through the microscope
  2. Track the tool tip motion through the microscope
  3. Compute the 3D tool motion with the camera motion removed

We also want to understand how a surgical robot reduces surgeon hand tremor. To achieve this we will:

  1. Perform frequency analysis on video of hand-held and robot-held tools.

Deliverables

  • Minimum: Expected by April 19th
    1. An algorithm to triangulate 3D points from stereo video and track background motion (with fiducial points)
    2. Tool tracking algorithm using microscope video of painted tools
  • Expected: Expected by April 28th
    1. Validation of triangulated points using known world motion
    2. Frequency analysis of the tool tip motion from a stereo video
  • Maximum: Expected by May 12th
    1. Comparison of hand-held and robot-held tool tremor

Technical Approach

  1. Record data:
    • Goal: Record microscope video of background and tool moving known distances to validate microscope tracking. Record video of suturing experiment on a chicken phantom.
    • Steps:
      1. 3D Print background fiducial (“chicken holder”)
      2. Paint color fiducials on surgical tools
      3. Record microscope video
    • Figure: a video frame from the chicken suturing experiment

  2. Compute 3D microscope motion:
    • Goal: Compute the 3D background motion in microscope video. Assuming the background is stationary in the world, the perceived background motion in the video is the camera motion.
    • Steps:
      1. Calibrate microscope
      2. Extract fiducial points from the background phantom
      3. Triangulate 3D fiducial point locations
      4. Find correspondences between 3D points in different video frames
      5. Compute the 3D rigid transformation between the point clouds in consecutive frames; this transformation is the camera motion between the frames (a sketch of this step appears after this list).
  3. Find 3D tool tip position:
    • Goal: Accurately detect the tool tip in the video frames, then triangulate the 3D tool tip position.
    • Steps: In the left and right images of each stereo video frame:
      1. Detect color fiducials on tool
      2. Triangulate the 3D tool tip position (see the sketch after this list).
  4. Motion Analysis:
    • Goal: Compute the frequencies of motion in the observed tool motion. Compare the magnitude of high frequency tremor in hand-held and robot-held tools.
    • Steps:
      1. Take the Fourier transform of the optical tracking results
      2. Use the computed microscope motion to transform the detected tool tip positions into the coordinate frame of the first image
      3. Take the Fourier transform of the tool tip positions in the coordinates of the first image
      4. Compare the frequency analysis results from the hand-held and robot-held data.
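
The two sketches below illustrate how the tool tip step (detect the color fiducial, then triangulate) and the camera-motion step could be implemented. They are minimal Python/OpenCV outlines written under the stated assumptions, not the project's actual implementation; the function names and threshold values are illustrative.

The first sketch assumes the microscope calibration provides 3x4 projection matrices for the left and right views, and that the tool fiducial is painted green; the HSV bounds are placeholders that would need to be tuned to the paint and the microscope lighting.

  import cv2
  import numpy as np

  def detect_green_marker(img_bgr, lower=(40, 80, 80), upper=(80, 255, 255)):
      """Return the pixel centroid of the green fiducial, or None if absent.

      The HSV bounds are placeholder values, not the thresholds used in
      the project.
      """
      hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
      mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
      m = cv2.moments(mask, binaryImage=True)
      if m["m00"] == 0:                      # marker not visible in this frame
          return None
      return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

  def triangulate(P_left, P_right, pts_left, pts_right):
      """Triangulate (N, 2) matched pixel coordinates into (N, 3) points.

      P_left, P_right are the 3x4 projection matrices from the stereo
      microscope calibration.
      """
      pts_h = cv2.triangulatePoints(P_left, P_right,
                                    pts_left.T.astype(np.float64),
                                    pts_right.T.astype(np.float64))
      return (pts_h[:3] / pts_h[3]).T        # divide out the homogeneous scale

For the camera-motion step, once the background fiducials have been triangulated and matched across two frames, the rigid transformation between the two point clouds has a closed-form least-squares solution (the SVD method of Arun et al.). A minimal sketch, assuming the correspondences are already established:

  import numpy as np

  def rigid_transform_3d(P, Q):
      """Least-squares rigid transform (R, t) mapping point set P onto Q.

      P, Q: (N, 3) corresponding background fiducial points from two
      consecutive frames. With a stationary background, (R, t) is the
      camera motion between the frames.
      """
      p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
      H = (P - p_mean).T @ (Q - q_mean)      # 3x3 cross-covariance matrix
      U, _, Vt = np.linalg.svd(H)
      R = Vt.T @ U.T
      if np.linalg.det(R) < 0:               # guard against a reflection
          Vt[-1, :] *= -1
          R = Vt.T @ U.T
      t = q_mean - R @ p_mean
      return R, t

Composing the per-frame transforms gives each frame's pose relative to the first frame, which is what the motion analysis step uses to express the tool tip positions in the coordinates of the first image.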

Dependencies

  1. Access to microscope and video capture computer
    • Resolution Plan: I will coordinate with Dr. Taylor and the lab.
    • Status: Resolved
  2. Chicken holding phantom (Background fiducial phantom)
    • Resolution Plan: Enlist other members of the lab to help me.
    • Status: Resolved.
  3. Access to robot
    • Resolution Plan: Determine when the robot will be needed. Coordinate with Dr. Taylor and the lab.
    • Status: Resolved
  4. Access to tools
    • Resolution Plan: Determine when the tools will be needed. Coordinate with Dr. Taylor and the lab. Obtain permission to add an optical marker to a tool.
    • Status: Resolved
  5. Access to the optical tracking system and help from Paul to use it
    • Resolution Plan: Coordinate with Dr. Taylor and Paul.
    • Status: Resolved; I am no longer using the optical tracking system, so it is no longer a dependency.

Milestones and Status

  1. Milestone name: Record Ground Truth Data
    • Planned Date: 3/15
    • Expected Date: 3/31
    • Completed: 3/31
    • Status: I am not using the optical tracking system for ground truth. Instead, I recorded microscope video of the background “chicken holder” and the tool moving known distances: the chicken holder was moved along a graph-paper background, and the painted tool was moved on top of a checkerboard background. I also recorded video of freehand and robot-assisted suturing.
  2. Milestone name: Compute Background Motion
    • Planned Date: 3/01
    • Expected Date: 4/21
    • Completed: 5/4
    • Status: I have completed bundle adjustment to accurately stabilize the background points.
  3. Milestone name: Implement Video Tracking Algorithm
    • Planned Date: 5/03
    • Expected Date: 4/26
    • Completed: 5/4
    • Status: The color marker tracking code is completed. The blue markers in the background are tracked using a template of the marker layout. The green markers on the tools are tracked individually.
  4. Milestone name: Frequency Analysis
    • Planned Date: 5/12
    • Expected Date: 5/12
    • Completed: 5/12
    • Status: I have completed the frequency analysis code. I can take in 3D points, form trajectories in time, and get the magnitude and angle of the frequency response. My code can filter for tremor and take the inverse DFT to compare the tool path with and without tremor (a sketch of this analysis follows below).
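
A minimal sketch of this analysis, assuming the tool tip trajectory has already been expressed in the coordinate frame of the first image and is sampled at the video frame rate. The 8-12 Hz band used here is a commonly cited range for physiological hand tremor, not a value taken from this project.

  import numpy as np

  def tremor_analysis(traj, fps, band=(8.0, 12.0)):
      """Frequency analysis of a 3D tool tip trajectory.

      traj: (N, 3) tool tip positions in the first image's coordinate
      frame. fps: video frame rate in Hz. band: (low, high) frequencies
      in Hz treated as tremor (placeholder values).

      Returns the frequency bins, the per-axis magnitude spectrum, and
      the trajectory with the tremor band removed (inverse DFT of the
      band-stopped spectrum).
      """
      n = traj.shape[0]
      freqs = np.fft.rfftfreq(n, d=1.0 / fps)
      spectrum = np.fft.rfft(traj, axis=0)           # DFT of each coordinate
      magnitude = np.abs(spectrum)

      # Band-stop filter: zero the tremor band, keep all other frequencies.
      in_band = (freqs >= band[0]) & (freqs <= band[1])
      filtered = spectrum.copy()
      filtered[in_band, :] = 0.0
      traj_no_tremor = np.fft.irfft(filtered, n=n, axis=0)
      return freqs, magnitude, traj_no_tremor

Comparing the in-band magnitude between the hand-held and robot-held recordings then gives the tremor comparison described in the maximum deliverable.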

Reports and Presentations

Project Bibliography

References:

[1] P. Grunert, K. Darabi, J. Espinosa, and R. Filippi, “Computer-aided navigation in neurosurgery,” Neurosurgical Review, vol. 26, no. 2, pp. 73-99, 2003.

Reading List:

  • S. Leonard, A. Reiter, A. Sinha, M. Ishii, R. Taylor, and G. Hager, “Image-Based Navigation for Functional Endoscopic Sinus Surgery Using Structure From Motion,” in SPIE, San Diego, 2016.
  • B. Allen, F. Kasper, G. Nataneli, E. Dutson, and P. Faloutsos, “Visual Tracking of Laparoscopic Instruments in Standard Training Environments,” in MMVR, Newport Beach, 2011.
  • R. Sznitman, K. Ali, R. Richa, R. Taylor, G. Hager, and P. Fua, “Data-Driven Visual Tracking in Retinal Microsurgery,” in Medical Image Computing and Computer-Assisted Intervention (MICCAI), Nice, 2012.
  • L. Bouarfa, O. Akman, A. Schneider, P. P. Jonker, and J. Dankelman, “In-vivo real-time tracking of surgical instruments in endoscopic video,” Minimally Invasive Therapy & Allied Technologies, vol. 21, no. 3, pp. 129-134, 2012, DOI: 10.3109/13645706.2011.580764.
  • W. Zhao, C. Hasser, W. Nowlin, and B. Hoffman, “Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information,” U.S. Patent 8108072 B2, Jan 31, 2012.
  • A. Cano, F. Gaya, P. Lamata, P. Sanchez-Gonzalez, and E. Gomez, “Laparoscopic Tool Tracking Method for Augmented Reality Surgical Applications,” in LNCS, vol. 5104, pp. 191-196, 2008.

Other Resources and Project Files

Tremor Analysis code: tremor_analysis.tar.gz

Github code repository: mollynnobrien/Tremor_Analysis
