======Realtime Feedback Tool for Nasal Surgery======
**Last updated: 2016-05-06 12:30 AM**
======Summary======
Septoplasty is a surgical procedure that aims to correct a deviated septum, a condition which may cause breathing difficulty. Septoplasty is one of the first procedures residents learn in their residency, but also one of the last procedures they actually master. This long learning curve is caused by low visibility of the surgical target, which makes it hard for residents to see how the procedure is correctly performed. By adding line-of-cut visualization, we hope to help residents learn and evaluate their performance.
* **Students:** Michael Norris, Felix Jonathan
* **Mentor(s):** Narges Ahmidi, Dr. Masaru Ishii, Dr. Lisa Ishii
======Background, Specific Aims, and Significance======
Septoplasty relies on the surgeon's instinct to estimate the nasal region to cut. Training residents in septoplasty is hard because they cannot see the surgery being performed by the fellow, owing to the small and limited viewing region. No existing technology visualizes the septum surface or the cutting tool's line of cut during septoplasty surgery. This tool will aid in training surgeons by visualizing the predicted line of cut for the surgeon.
======Deliverables======
* **Minimum:** (Initially expected by 3/14, Plans modified to 4/3, Completed)
- Training procedure for any model of scissors. Completed.
- Line of Cut prediction for scissors. Completed.
- Visualizing line of cut prediction and septum surface / phantom. Completed.
- Septum surface reconstruction by tracing the actual septum. Completed.
- Documentation for all software and mechanical designs. Completed.
* **Expected:** (Expected by 4/1, Plans modified to 5/1)
- Realtime visualization of line of cut prediction on septum surface (> 15 Hz refresh rate). Completed.
- HTTP-based web service to send data from existing software to our project. Revised: no longer needed.
- Software that validates the accuracy of a cut with respect to the prediction on the phantom. Revised: this deliverable was poorly defined and was folded into the validation deliverable below.
- Validation and accuracy measurement for all components. Completed.
- Mesh reconstruction from point cloud data. Not completed.
* **Maximum:** (Expected by 4/14, Plans modified to 5/1)
- Reasonable accuracy for line of cut prediction. Revised: any further improvement in accuracy (the original target depended on complete specifications for every sensor and tracker we use). Completed.
- Septum surface reconstruction by randomized septum surface touching. Not completed.
- Image projection of anatomy onto surface mesh. Not completed.
======Technical Approach======
===== Scissor Model Training (minimum deliverable) =====
Since surgical scissors may vary in blade design, cut trajectory, and EM sensor mounting position, each pair of scissors must be trained before its line of cut can be visualized.
Two basic scissor model properties need to be resolved for most scissors:
* Scissor pinch point local coordinate frame relative to scissor's EM sensor frame.
* Scissor blade range/length
=== Summary of the Procedure ===
- Attach one EM sensor to the scissors and another EM sensor to the training plane.
- Perform pivot calibration on the EM pointer to get an accurate tip position. Pivot calibration is already provided in the Aurora Application module.
- Use the EM pointer to estimate the scissor cutting plane normal by moving the tip arbitrarily around the scissor blade surface.
- Use the EM pointer to estimate the training plane normal vector by moving the tip arbitrarily around the training plane surface. We use the PCL SACSegmentation algorithm to estimate the plane parameters.
- Draw a line with the EM pointer on the training plane surface to define the direction of the cut and the starting point for the scissor cut.
- Calculate the line position relative to the training plane.
- Use the scissors to cut the training plane.
- The scissor pinch point is located where the cut starting point was marked on the plane. By transforming the scissor frame relative to the septum plane, the pinch point position relative to the scissor EM sensor can be retrieved directly.
- The cross product of the scissor plane normal with the septum plane normal equals the direction of the cut. Since we have the direction of the cut, the scissor plane, and the septum plane normal, we can estimate the error in the scissor plane normal estimation.
- To improve accuracy, repeat the cut training several times and estimate the best result using linear least squares.
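As an illustration of the plane estimation used throughout the training steps, a least-squares plane fit (the quantity that PCL's SACSegmentation recovers for us) can be sketched in a few lines. This is a numpy-based sketch under our own naming, not the project's actual code:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).

    The normal is taken as the singular vector of the centered point
    cloud with the smallest singular value, i.e. the direction of
    least variance across the traced points.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # SVD of the centered cloud; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```

Tracing the blade surface (or the training plane) with the pointer and feeding the collected tip positions to such a fit yields the plane normal used in the steps above; repeating the training and averaging over trials reduces the effect of sensor noise.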
===== Septum Surface Data Generation (minimum deliverable) =====
The septum surface is approximated with a 2D convex hull surface in 3D space.
=== Summary of the Procedure ===
- Attach an EM sensor to the patient's forehead.
- The L-strut region of the nose, a critical structure in the septoplasty procedure, is tightly tied to the nasal bridge position (see Fig. 1). To establish the estimated position of the L-strut, the nasal bridge (NB) is traced with the EM pointer.
- Transform the collected nasal bridge point cloud from the EM pointer to the patient's forehead coordinate system.
- To approximate the shape of the septal nasal cartilage (NS), points in that region are needed. These are collected by tracing the septum surface with the EM pointer in a zig-zag pattern to generate a point cloud.
- Transform the collected septum surface point cloud from the EM pointer to the patient's forehead coordinate system.
- Combine the nasal bridge and septum surface point clouds to generate a convex hull or mesh that approximates the shape of the cartilage.
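A minimal sketch of the final hull step, assuming the combined point cloud has already been projected into 2D coordinates on the fitted septum plane (the project uses library routines for this; the standalone monotone-chain version below is for illustration only):

```python
def convex_hull_2d(points):
    """Andrew's monotone chain: returns hull vertices in CCW order.

    `points` is an iterable of (x, y) pairs in plane coordinates;
    collinear points on the hull boundary are dropped.
    """
    pts = sorted(set(map(tuple, points)))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left-to-right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right-to-left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Last point of each half is the first point of the other half.
    return lower[:-1] + upper[:-1]
```

The resulting 2D hull vertices, lifted back onto the fitted plane, give the convex surface patch that stands in for the cartilage.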
===== Line of Cut Prediction (minimum deliverable) =====
The reconstructed septum surface, the trained scissor model, and EM sensor data from both the patient's head and the surgical scissors are the inputs needed to predict the line of cut on the septum surface.
=== Summary of the Procedure ===
- Estimate the septum surface plane parameters.
- Load the current scissor parameters.
- Collect the scissor and septum pose sensor data.
- Generate a line from the intersection of the scissor cutting plane and the septum plane.
- Project the scissor pinch point onto the generated line to get the initial cutting point.
- Calculate the distance between the scissor pinch point and the initial cutting point to get the maximum line-of-cut length.
- Calculate the end cutting point from the initial cutting point, the maximum line-of-cut length, and the generated line direction.
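The geometric core of the intersection and projection steps can be sketched as follows. This is a numpy illustration under our own naming (not the project's actual code), and it assumes the two planes are not parallel:

```python
import numpy as np

def plane_plane_intersection(n1, d1, n2, d2):
    """Intersect the planes n1.x = d1 and n2.x = d2.

    Returns (point_on_line, unit_direction). The direction is the
    cross product of the two normals; the point is the solution of
    both plane equations closest to the origin.
    """
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    direction = np.cross(n1, n2)
    direction = direction / np.linalg.norm(direction)
    # Two plane equations plus "direction . x = 0" pin down one point.
    A = np.vstack([n1, n2, direction])
    b = np.array([d1, d2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction

def project_point_on_line(p, point, direction):
    """Orthogonal projection of p onto the line point + t * direction."""
    t = np.dot(np.asarray(p, float) - point, direction)
    return point + t * direction
```

With the scissor cutting plane and the septum plane as inputs, the intersection gives the candidate cut line, and projecting the pinch point onto it gives the initial cutting point; the end point then lies along the line direction at the blade-length distance.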
===== Server/Client Communication =====
Our project consists of three main modules: SeptoServer (provided to us by our mentor), which processes sensor pose information from the Aurora tracker; Platform.cpp, which receives the data from SeptoServer and routes it to the LineOfCutGenerator or the SeptumSurfaceGenerator depending on the operating mode; and a Python visualization module that uses VTK to visualize the surgical scissors, the line of cut, and the septum plane.
The structure of the data packets sent between these applications is documented in the final report.
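The actual field layouts are documented in the final report. Purely as an illustration, a fixed-layout binary packet of this kind (here a hypothetical layout: a one-byte valid flag, a seven-float scissor pose, and the two 3D endpoints of the line of cut) could be packed and unpacked with Python's standard `struct` module:

```python
import struct

# Hypothetical little-endian layout, NOT the project's real format:
# 1 byte  valid flag
# 7 floats scissor pose (x, y, z, qx, qy, qz, qw)
# 6 floats line of cut (start xyz, end xyz)
PACKET_FMT = "<B7f6f"

def pack_line_of_cut(valid, pose, line_start, line_end):
    return struct.pack(PACKET_FMT, valid, *pose, *line_start, *line_end)

def unpack_line_of_cut(data):
    fields = struct.unpack(PACKET_FMT, data)
    return {"valid": fields[0],
            "pose": fields[1:8],
            "line_start": fields[8:11],
            "line_end": fields[11:14]}
```

A fixed binary layout like this keeps per-packet parsing cheap enough for a greater-than-15 Hz refresh rate, which is one reason simple data piping sufficed in place of an HTTP service.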
===== Data Visualization =====
The Data Visualization Module is written with VTK. At the beginning of operation, the surgeon traces the septum plane, and Platform.cpp sends the resulting surface model to the visualization module using the data format described in the previous section. The module then receives data packets (also described in the previous section) containing the line of cut and the scissor pose.
The Data Visualization Module displays the surgical scissors and updates their pose whenever it receives a line-of-cut packet containing the scissor pose.
When the packet's valid flag is 1, the line of cut is displayed.
======Dependencies======
* EM Tracker and EM Control Unit -- already available
* Pointer tool for surface reconstruction -- already available
* Access to laboratory environment -- already provided
* Access to rapid prototyping machinery -- no longer needed
* EM Tracker holder -- already provided (a 3D-printed prototype was used until it arrived)
* Surgical Scissors -- already available
* Learning the CISST library for a variety of applications (pivot calibration, 2D-3D registration, visualization, etc.) -- we only use CISST to communicate with the EM sensor, and rely on other libraries for matrix manipulation, visualization, and point cloud processing
* Code for communicating with the EM Tracker and reading poses in realtime -- already provided
* Phantom for septal plane -- no longer needed; we use cardboard for training
======Milestones and Status ======
- Milestone name: Data Pipeline between sensor and PC
* Planned Date: March 4th
* Expected Date: March 25th
* Status: Completed
- Milestone name: Developed software for scissor model training
* Planned Date: March 23rd
* Expected Date: April 3rd
* Status: Completed
- Milestone name: Developed software for line of cut prediction
* Planned Date: March 23rd
* Expected Date: April 3rd
* Status: Completed
- Milestone name: Visualize
- Plot Scissors
- Plot Line
- Plot 3d Mesh (scan Pointer)
- Plot scissor edge (track edge)
- add pan / zoom function
* Planned Date: March 30th
* Expected Date: March 30th
* Status: Completed.
- Milestone name: Developed simple GUI for the software
* Planned Date: April 3rd
* Expected Date: April 10th
* Status: Completed
- Milestone name: HTTP-based communication for sending sensor data across the network. Revised: used data piping instead.
* Planned Date: April 3rd
* Expected Date: April 10th
* Status: Completed
- Milestone name: Performance Test
* Planned Date: April 6th
* Expected Date: April 6th
* Status: Completed
- Milestone name: Evaluation for line of cut prediction accuracy
* Planned Date: April 13th
* Expected Date: April 20th
* Status: Completed
- Milestone name: Report
* Planned Date: April 27th
* Expected Date: April 27th
* Status: Completed
======Reports and presentations======
* Project Plan
* {{:courses:446:2016:446-2016-03:project_planpresentation.pptx| Project plan presentation}}
* {{:courses:446:2016:446-2016-03:project_proposal.pdf|Project plan proposal}}
* Project Background Reading
* Ahmidi, N., Poddar, P., Jones, J. D., Vedula, S. S., Ishii, L., Hager, G. D., & Ishii, M. (2015). Automated objective surgical skill assessment in the operating room from unstructured tool motion in septoplasty. International Journal of Computer Assisted Radiology and Surgery.
* Radley, G. J., Sama, A., Watson, J., & Harris, R. A. (2009). Characterization, quantification, and replication of human sinus bone for surgery simulation phantoms. Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine, 223(7), 875-887.
* Fong, Y., Giulianotti, P. C., Lewis, J., Koerkamp, B. G., & Reiner, T. (2015). Imaging and visualization in the modern operating room: A comprehensive guide for physicians. 17-27, 121-132, 181-191
* D’Ascanio, L., & Manzini, M. (2009). Quick Septoplasty: Surgical Technique and Learning Curve. Aesthetic Plastic Surgery, 33(6), 814-818.
* Project Checkpoint
* {{:courses:446:2016:446-2016-03:project_checkin.pptx | Project Checkin 3/22/2016}}
* Paper Seminar Presentations
* {{:courses:446:2016:446-2016-03:norris_seminar.pptx | Michael Norris Paper Seminar Presentation 4/19/2016}}
* {{:courses:446:2016:446-2016-03:norris_paper_2.pdf | Michael Norris Paper Seminar Paper 4/19/2016}}
* {{:courses:446:2016:446-2016-03:felix_jonathan_paper_seminar_presentation.pdf| Felix Jonathan Paper Seminar Presentation 4/28/2016}}
* {{:courses:446:2016:446-2016-03:felix_jonathan_paper_seminar_paper.pdf| Felix Jonathan Paper Seminar Paper 4/28/2016}}
* [[http://www.vision.jhu.edu/assets/TaoIPCAI12.pdf]]
* Project Final Presentation
* {{:courses:446:2016:446-2016-03:poster_violins.pdf| Poster}}
* Project Final Report
* {{:courses:446:2016:446-2016-03:final-paper2.pdf|Final Report}}
* {{:courses:446:2016:446-2016-03:softwaredocumentation.pdf|Software Documentation}}
* {{:courses:446:2016:446-2016-03:packet-routing.pdf|Packet Format (Inter-process Communication)}}
======Project Bibliography======
* Ahmidi, N., Poddar, P., Jones, J. D., Vedula, S. S., Ishii, L., Hager, G. D., & Ishii, M. (2015). Automated objective surgical skill assessment in the operating room from unstructured tool motion in septoplasty. International Journal of Computer Assisted Radiology and Surgery.
* Radley, G. J., Sama, A., Watson, J., & Harris, R. A. (2009). Characterization, quantification, and replication of human sinus bone for surgery simulation phantoms. Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine, 223(7), 875-887.
* Fong, Y., Giulianotti, P. C., Lewis, J., Koerkamp, B. G., & Reiner, T. (2015). Imaging and visualization in the modern operating room: A comprehensive guide for physicians. 17-27, 121-132, 181-191
* D’Ascanio, L., & Manzini, M. (2009). Quick Septoplasty: Surgical Technique and Learning Curve. Aesthetic Plastic Surgery, 33(6), 814-818.
======Other Resources and Project Files======
{{:courses:446:2016:446-2016-03:datamodels.png?200|}}
{{:courses:446:2016:446-2016-03:softwarearchitecture.png?200|}}