BME CAPSTONE, SPRING 2019
MACHINE LEARNING GUIDED ULTRASOUND CANNULATION, 
SPONSORED BY US DEPARTMENT OF DEFENSE, SOF’20
E. CYNOR, M. FOLKERTS, L. SCHERGER, A. SOEWITO, R. UPADHYAYA

Where to begin! In late 2018 we were tasked with addressing the unusability of ultrasound for guiding cannulation. This was an especially rewarding task because there were no bounds, so we ultimately had a lot of control over the type of solution we created.

In January of 2019, I witnessed an open-heart surgery on an infant. I was working on a project for the Department of Defense to enable non-experts to use ultrasound to guide cannulation (the insertion of a needle into an artery to deliver medication). The larger mission is to improve accessibility and efficiency of battlefront emergency care.

Observing the surgery made me realize how difficult cannulation is, even for an expert in a controlled operating room. The doctor has to simultaneously hold the ultrasound steady with one hand, use the other to keep the skin tight, and use their “third hand” to insert the needle. On top of that, they have to constantly pan their vision back to the blurry ultrasound screen, which is used to locate the artery. After six failed cannulation attempts, the doctor successfully inserted the needle, but the image disappeared on screen, so they couldn’t tell how to angle the needle to prevent going too deep.

Three common problems were echoed throughout our interviews: blurry imaging, needle disappearance, and awkward handling of the physical setup. To experience these firsthand, we built a practice model for cannulation using chicken breast, balloons, gelatin, and straws. To address these three concerns, we created a three-part solution: vessel identification, line detection for needle tracking, and an image stabilizer.



To realize these three solutions, we needed to develop models for data collection and testing, because we couldn't just cannulate ourselves (that would require sticking a needle into our own arteries, which the IRB wouldn't be too stoked about). We went through several rounds of ideation until we had optimized data collection enough to capture hundreds of cross-sectional images of our own arteries using a handheld ultrasound. Inspired by a car phone mount, we attached the screen to the transducer to consolidate the field of view into one area and free a hand. The improved setup helped us take over 200 images of our model, which we annotated to teach ourselves how to recognize the artery and find the optimal angle for needle insertion. We then realized we could use those same images to train an AI system to guide non-experts through cannulation. During this design process, I learned how helpful it is to not only observe the problem but also experience it, so that the solution is grounded in empathy rather than assumptions.
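The annotate-then-identify loop described above can be illustrated with a toy sketch. This is not our actual pipeline: synthetic frames and a simple intensity-threshold detector stand in for real ultrasound data and a trained model, and all function names here are hypothetical.

```python
import numpy as np

def make_frame(cx, cy, r=6, size=64, noise=0.05, seed=0):
    # Synthetic "ultrasound" frame: bright speckle background with a dark
    # circular vessel cross-section at (cx, cy). A stand-in for B-mode data.
    rng = np.random.default_rng(seed)
    img = 0.7 + noise * rng.standard_normal((size, size))
    yy, xx = np.mgrid[0:size, 0:size]
    img[(xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2] = 0.1  # anechoic lumen
    return img

def locate_vessel(img, thresh=0.4):
    # Identify the darkest (anechoic) region and return its centroid --
    # a crude proxy for the "find the artery cross-section" annotation step.
    mask = img < thresh
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

frame = make_frame(cx=40, cy=22)
x, y = locate_vessel(frame)
```

On this synthetic frame the centroid lands near the true lumen center; a real system would replace the threshold with a model trained on the annotated images.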

My role in this project centered on biomechanics, which included designing the hardware attachment, capturing and annotating vessel images to build the algorithm, and interviewing physicians about cannulation. I also created schematics and idea maps to illustrate our concepts for presentations.

Despite lacking enough images to fully refine the model, our solution did meet our objective: improving the user's experience of cannulating with an ultrasound imager. This was demonstrated during our final presentation, where our professors attempted a cannulation on an ultrasound model using our technology.

The following images were part of my individual lab notebook. They outline some of the steps of prototyping, including brainstorms, solution synthesis, and design analysis.



Solution synthesis mapping to address the three design objectives:
vessel identification, line detection for needle tracking, and image stabilization


Decision making to address our technical and non-technical considerations


Phone stand ideation: physical design needs




Concept schematic and plans for physical prototype



Physical phone stand to improve mechanical stability


Engineering and cost analysis to determine high-throughput manufacturing capability