Last updated: May 5, 2021
Skin biopsies are used by dermatologists to diagnose cutaneous ailments, including tumors and rashes. However, if surgery becomes necessary after a biopsy, determining the original biopsy site can be difficult due to various factors, including skin healing, biopsy depth, and background skin disease. This difficulty can lead to wrong-site surgery, a never event: an error that is preventable and should never occur.
This project aims to create a mobile augmented reality application (to be deployed on a phone or tablet) that can register biopsy images to surgery images and subsequently overlay the biopsy site on live camera images taken by the mobile device. This would provide dermatologists with guidance sufficient to locate the biopsy site on the patient at the time of surgery.
Dermatologists commonly use photography to locate biopsy sites, but this is prone to human error and could lead to wrong-site surgery or having to redo the biopsy. Others have attempted to address this need using various methods and tools, including a UV-fluorescent tattoo [2][3], a transparent grid [4], confocal microscopy [5], and “selfies” [6][7][8][9].
However, none of these have been incorporated into general practice yet, possibly due to cost, insufficient reliability, or excessive disruption to the typical workflow.
Augmented reality and facial recognition have been used in conjunction to triangulate biopsy site locations on static photos [10]. However, that approach does not provide a live image overlay and is only effective for biopsies on the face; we intend to create an augmented reality overlay that not only provides live guidance but is also effective for biopsies anywhere on the patient's skin.
If we are successful, the mobile application could be used by dermatologists to improve the accuracy of biopsy site localization, improving outcomes and reducing risks for the patient.
Our intention is to create an application with the following UI workflow:
1. At the time of biopsy, the procedure does not change: the dermatologist will take two 2D color photos of the biopsy site, one close up and one at some distance so as to capture anatomical landmarks.
2. When the patient comes in for surgery, the dermatologist will import the biopsy image from their photo library on their mobile device. They will also place computer vision tracking markers on the patient near the biopsy site.
3. The application will then provide an edge overlay using the biopsy photo in order to assist in taking the surgery photo, so that the two images can be as similar as possible (a rough sketch of such an overlay follows this list). The user will then manually label the biopsy site and anatomical features.
4. After that, the software will internally register the biopsy site to the markers and then overlay the biopsy site on the live camera feed.
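As a rough illustration of step 3, the sketch below blends Canny edges from the biopsy photo onto a live camera feed with OpenCV. The file name, camera index, and edge thresholds are placeholder assumptions, not settled design choices.

  import cv2

  # Load the biopsy photo (file name is a placeholder) and compute its edge
  # map; the Canny thresholds are guesses that would need tuning.
  biopsy = cv2.imread("biopsy.jpg")
  gray = cv2.cvtColor(biopsy, cv2.COLOR_BGR2GRAY)
  edges = cv2.Canny(gray, 50, 150)

  cap = cv2.VideoCapture(0)  # default camera
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      # Resize the edge map to the live frame and paint the edge pixels
      # green so the user can line the camera up with the biopsy photo.
      overlay = cv2.resize(edges, (frame.shape[1], frame.shape[0]))
      frame[overlay > 0] = (0, 255, 0)
      cv2.imshow("Edge overlay", frame)
      if cv2.waitKey(1) & 0xFF == ord("q"):
          break
  cap.release()
  cv2.destroyAllWindows()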
Broadly speaking, our application has three parts: the registration algorithm, the live marker tracking, and the mobile augmented reality application.
Registration Algorithm
Ruby will implement the registration algorithm using Python on Windows 10 with OpenCV packages. This can be prototyped with GRIP, an application typically used for rapid prototyping of computer vision algorithms.
The program will take user clicks as input, as pixel coordinates in both the biopsy and surgery photos, for the biopsy site and tracking points. If the two photos are sufficiently similar, labeling only one photo may suffice, which would reduce human inconsistency.
Feature detection, possibly corner detection, can be used to snap the clicked points to precise tracking points nearby. The program will then fit a 2D-to-2D homography between the photos and draw a circle or dot at the predicted biopsy site.
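A minimal sketch of this pipeline, assuming OpenCV's corner refinement (cv2.cornerSubPix) and homography fitting (cv2.findHomography); the function name, interface, and window size below are placeholders of our own, not a settled design:

  import cv2
  import numpy as np

  def register_biopsy_site(biopsy_gray, surgery_gray,
                           biopsy_pts, surgery_pts, site):
      # Snap each rough click to a nearby strong corner (11x11 window).
      criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01)
      b = cv2.cornerSubPix(biopsy_gray,
                           np.float32(biopsy_pts).reshape(-1, 1, 2),
                           (11, 11), (-1, -1), criteria)
      s = cv2.cornerSubPix(surgery_gray,
                           np.float32(surgery_pts).reshape(-1, 1, 2),
                           (11, 11), (-1, -1), criteria)
      # Fit a homography from biopsy coordinates to surgery coordinates
      # (requires at least four point pairs).
      H, _ = cv2.findHomography(b, s, cv2.RANSAC)
      # Map the labeled biopsy site into the surgery photo.
      return cv2.perspectiveTransform(
          np.float32([site]).reshape(-1, 1, 2), H).ravel()

  # x, y = register_biopsy_site(...); then, to visualize:
  # cv2.circle(surgery_bgr, (int(x), int(y)), 10, (0, 0, 255), 2)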
To test this, we can start by registering a biopsy photo to itself and checking that the marked position coincides with the actual biopsy site. Then we can move on to testing the algorithm on genuine biopsy/surgery photo pairs at various locations on the patient.
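As a quick sanity check of the self-registration case (the points and tolerances here are arbitrary): identical point sets should yield the identity homography, up to scale, so the predicted site must equal the clicked site.

  import cv2
  import numpy as np

  pts = np.float32([[10, 10], [200, 15], [190, 220], [12, 230]]).reshape(-1, 1, 2)
  H, _ = cv2.findHomography(pts, pts)
  assert np.allclose(H / H[2, 2], np.eye(3), atol=1e-6)

  site = np.float32([[105, 117]]).reshape(-1, 1, 2)
  assert np.allclose(cv2.perspectiveTransform(site, H), site, atol=0.5)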
Live Marker Tracking
For live marker tracking, we have decided to use colored stickers as markers, which will be placed near the presumed biopsy location. This will also be implemented with OpenCV packages.
The markers can be segmented using hue/saturation/value (HSV) thresholding; their contours can then be extracted and filtered, and the marker centroids computed from them. These centroid points will be used to calculate the 2D transformation of the biopsy site for each frame.
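A sketch of this detection step, assuming OpenCV 4's two-value findContours return and an area filter whose threshold is a guess:

  import cv2

  def find_marker_centroids(frame_bgr, hsv_lo, hsv_hi, min_area=50.0):
      # Segment the sticker color, then reduce each blob to its centroid.
      hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
      mask = cv2.inRange(hsv, hsv_lo, hsv_hi)
      contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                     cv2.CHAIN_APPROX_SIMPLE)
      centroids = []
      for c in contours:
          m = cv2.moments(c)
          if m["m00"] >= min_area:  # drop specks of noise
              centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
      return centroids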
We can also calibrate for different lighting conditions: the dermatologist takes a picture from the live feed and selects a marker, and the pixel color at that selection is used to adjust the HSV threshold.
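One way this calibration might look: widen the tapped pixel's HSV color into a threshold window. The margins below are guesses, and the sketch ignores hue wraparound (e.g., for red stickers); OpenCV hue runs 0-179.

  import cv2
  import numpy as np

  def calibrate_threshold(frame_bgr, x, y, margin=(10, 60, 60)):
      # Read the HSV color of the tapped pixel and build lower/upper
      # bounds around it for cv2.inRange.
      hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
      h, s, v = hsv[y, x].astype(int)
      lo = np.array([max(h - margin[0], 0),
                     max(s - margin[1], 0),
                     max(v - margin[2], 0)], dtype=np.uint8)
      hi = np.array([min(h + margin[0], 179),
                     min(s + margin[1], 255),
                     min(v + margin[2], 255)], dtype=np.uint8)
      return lo, hi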
Application Development
For an Xcode approach, we can create a Swift or Objective-C application with an OpenCV dependency via CocoaPods, using Xcode storyboards and Cocoa Touch for the UI layout. OpenCV also provides an iOS framework that we can use for live AR tracking and overlay within the app.
Alternatively, we may look into Unity, which is better suited to cross-platform development, since Xcode only runs on macOS.
For integrating the mobile application with the registration and live marker tracking algorithms, we intend to use data formats that can be imported and exported by independent code, such as JSON or YAML. The data structure may contain information about each point, such as its center and radius, or just the point coordinates.
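One possible shape for such a record, written from the Python side; the field names, file name, and values here are illustrative, not a settled schema:

  import json

  # The predicted biopsy site plus the marker centroids it was
  # registered against, to be read by the mobile app.
  record = {
      "biopsy_site": {"center": [412.3, 298.7], "radius": 10.0},
      "markers": [[120.0, 80.5], [360.2, 75.9], [344.8, 310.4]],
  }
  with open("registration.json", "w") as f:
      json.dump(record, f, indent=2)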
Registration and Tracking (Ruby)
Application Development (Liam)