Research Project
In my final research project at the University of Illinois at Chicago, I designed and developed an accessible, cost-effective ultrasound-guided liver biopsy simulator. By leveraging the inertial tracking capabilities of contemporary smartphones, the simulator offers a feasible way to introduce hands-on training in ultrasound procedures to beginning learners, such as medical students. This marks the first application of such methods to simulating an ultrasound-guided procedure.
Designing a Low-Cost Ultrasound-Guided Liver Biopsy Simulator Integrating Inertial Tracking Technology
Ultrasound-guided biopsies, in which a biopsy needle is precisely inserted into a specific target within the body under real-time ultrasound guidance, are routine procedures in diagnostic radiology. These procedures are typically taught with simulators, but existing models struggle to balance realism with affordability. High-fidelity simulators enhance learning outcomes, yet their bulk and cost restrict their accessibility for training large numbers of novice learners. There is therefore a pressing need for a cost-effective, portable simulator with realistic human anatomy to introduce ultrasound-guided biopsy techniques to beginners.
In collaboration with the University of Illinois at Chicago's Department of Radiology in the College of Medicine, this project explored the integration of smartphone inertial tracking technology and virtual 3D anatomical models to develop a biopsy simulator. The prototype simulator enables real-time tracking of the virtual probe and needle using the smartphone's accelerometer and gyroscope, allowing trainees to control the virtual instruments in a manner simulating actual biopsy procedures.
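The core of the tracking described above is turning raw accelerometer and gyroscope readings into a stable device orientation. The sketch below is a simplified illustration of one standard approach, a complementary filter; it is not the project's actual Unity code, and all function and variable names are hypothetical:

```python
import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into tilt angles.

    pitch, roll : current angle estimates in radians
    gyro        : (gx, gy) angular velocity in rad/s
    accel       : (ax, ay, az) acceleration in m/s^2
    dt          : time step in seconds
    alpha       : weight of the gyro term; the accelerometer corrects drift
    """
    gx, gy = gyro
    ax, ay, az = accel

    # Integrate angular velocity (accurate short-term, drifts long-term).
    pitch_gyro = pitch + gx * dt
    roll_gyro = roll + gy * dt

    # Tilt inferred from the gravity direction (noisy short-term, stable long-term).
    pitch_acc = math.atan2(ay, math.sqrt(ax * ax + az * az))
    roll_acc = math.atan2(-ax, az)

    # Blend the two estimates.
    pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1 - alpha) * roll_acc
    return pitch, roll
```

Run per sensor sample, this yields orientation angles that can drive a virtual probe or needle without the unbounded drift of gyroscope integration alone.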
The final deliverable comprises two components: 1) a desktop application that presents all the elements required for an ultrasound-guided liver biopsy, including a 3D virtual human torso and liver with a tumor buried inside, alongside virtual representations of an ultrasound probe and biopsy needle; and 2) a smartphone application that lets trainees pair their phones as controllers for either the virtual probe or the virtual needle.
3D Assets Creation
3D models of the torso and tumor mass were created in Pixologic ZBrush, while the 3D liver was produced by segmenting CT data in Materialise Mimics, then retopologized and painted in ZBrush. The 3D instruments, including the ultrasound probe and biopsy needles, were modeled in Autodesk 3ds Max. These models were then imported into the Unity game engine and registered in place.
Modules and Functionalities
1. Tutorial Scene
Upon initiating the application, users are greeted with a tutorial animation designed to showcase how to control the movement of virtual instruments using a smartphone. Located in the upper left corner, a dropdown menu offers trainees a choice between two animated tutorials: one demonstrating horizontal movement and the other showcasing rotational control of the virtual instruments.
2. Controlling the Movement of the Virtual Probe
Live accelerometer and gyroscope readings from the smartphone are streamed to the desktop application. This lets the trainee drive the horizontal and rotational movement of the virtual probe by moving the phone itself. Simultaneously, cross-sectional ultrasound images are digitally generated and displayed in the upper right corner for reference.
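Moving live sensor readings from a phone to a desktop application is commonly done with a lightweight datagram stream. The sketch below illustrates that general pattern only; the transport actually used by the simulator is not specified here, and the port number and function names are hypothetical:

```python
import json
import socket

# Hypothetical port for the sensor stream.
PORT = 5005

def send_reading(sock, addr, accel, gyro):
    """Phone side: serialize one IMU sample and send it as a UDP datagram."""
    packet = json.dumps({"accel": accel, "gyro": gyro}).encode("utf-8")
    sock.sendto(packet, addr)

def receive_reading(sock):
    """Desktop side: block for the next sample and decode it."""
    data, _ = sock.recvfrom(1024)
    sample = json.loads(data.decode("utf-8"))
    return sample["accel"], sample["gyro"]
```

UDP is a natural fit for this kind of control loop: a dropped sample is immediately superseded by the next one, so low latency matters more than guaranteed delivery.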
3. Controlling the Movement of the Virtual Needle
Likewise, the horizontal movement and rotation of the virtual biopsy needle are controlled through the smartphone's accelerometer and gyroscope, so the needle is manipulated in the same way as the probe.
4. Adjusting Phantom Transparency
Trainees can adjust the transparency of the virtual torso and liver using the sliders on the control panel. This lets them view the target tumor mass within the liver as a guide when they are uncertain about its position.
Procedure Simulation
As the final task, trainees are prompted to capture a virtual tumor sample. Using the real-time cross-sectional ultrasound images in the upper right corner as a reference, they first maneuver the virtual probe to locate the target tumor, then insert the virtual needle until it reaches the target. Clicking the "Fire Needle" button in the lower left corner of the screen gives feedback on whether the liver sample was successfully obtained.
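The success feedback at the end of the task implies a hit test between the needle tip and the tumor volume. A minimal version, approximating the tumor as a sphere (an assumption for illustration; the simulator's actual collision logic is not described here, and the names are hypothetical), could look like:

```python
import math

def needle_hit(tip, tumor_center, tumor_radius):
    """Return True if the needle tip lies inside a spherical tumor proxy.

    tip, tumor_center : (x, y, z) positions in world units
    tumor_radius      : radius of the spherical proxy volume
    """
    dx = tip[0] - tumor_center[0]
    dy = tip[1] - tumor_center[1]
    dz = tip[2] - tumor_center[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= tumor_radius
```

In a game engine this check would typically be delegated to the physics system (e.g. collider overlap), but the underlying geometry is the same point-in-volume test.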
Traces of the biopsy needle can be viewed on the ultrasound image. The circle denotes the needle tip, while arrows indicate the trace of the needle.
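Drawing the needle tip and its trace on the ultrasound image requires mapping 3D positions into the 2D coordinates of the probe's scan plane. The following is a minimal sketch of that projection, with hypothetical names; it assumes the scan plane is described by an origin point and two orthonormal in-plane axes:

```python
def project_to_scan_plane(point, origin, u_axis, v_axis):
    """Project a 3D point into 2D scan-plane coordinates.

    point          : (x, y, z) position to project (e.g. the needle tip)
    origin         : a point on the plane (e.g. the probe face center)
    u_axis, v_axis : orthonormal in-plane axes (image width and depth)
    Returns (u, v), the point's coordinates within the image plane.
    """
    d = [point[i] - origin[i] for i in range(3)]
    u = sum(d[i] * u_axis[i] for i in range(3))
    v = sum(d[i] * v_axis[i] for i in range(3))
    return u, v
```

Projecting the tip each frame and accumulating past positions gives exactly the kind of overlay described: a circle at the current tip and a trail of markers along its path.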