MR Simulator for Neonatal Needle Thoracentesis Procedure

Jump ARCHES: Mixed Reality (MR) Approach to Create Cyber-Physical Training Simulation Environments for Neonatal Needle Thoracentesis Procedures

Principal Investigator: Dr. Avinash Gupta
Summary:

Needle thoracentesis is a common and often life-saving procedure in emergency settings. Successful execution requires accurately locating the second or third intercostal space along the midclavicular line and inserting a needle until air is drawn into the syringe; the air is then released by turning the stopcock. However, physicians frequently struggle to locate the correct intercostal space, which can lead to complications such as pneumothorax, in which air accumulates in the pleural space outside the lungs after the chest wall is perforated.

The project aims to:

  1. Integrate a clinician-developed curriculum for teaching the needle thoracentesis procedure in augmented reality (AR) or mixed reality (MR).
  2. Develop a step-by-step system that governs the delivery of instructions and adapts to user preferences.
  3. Introduce ArUco marker tracking so that the virtual scene is anchored within boundaries specified by the user.

The software was developed in Unity, with a step system created to govern the guided simulation program: each button press triggers the appropriate events and the corresponding scene transition. The Mixed Reality Toolkit (MRTK) was used to build the mixed-reality interface for the HoloLens 2. Within Unity, the XR Software Development Kit (SDK) provided the core XR development features, with MRTK development built on the OpenXR plugin. An OpenCV plugin for Unity was integrated to implement ArUco marker tracking, currently compatible with native or external webcams. The software was ultimately deployed to the HoloLens 2 head-mounted display (HMD).
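
No source code accompanies this summary, so the following is only a minimal sketch of how such a step system could be structured in Unity C#. All names here (ProcedureStep, StepController, the animation and narration fields) are hypothetical; the sketch assumes each step bundles a visual animation trigger, a narration clip, and a UnityEvent for scene transitions, with an MRTK (or any other UI) button wired to call NextStep() on press.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;

// Hypothetical data for a single instructional step: an animation trigger,
// a narration clip, and an event hook for scene changes or highlights.
[Serializable]
public class ProcedureStep
{
    public string stepName;             // e.g. "Locate second intercostal space"
    public string animationTrigger;     // Animator trigger for the visual demonstration
    public AudioClip narration;         // spoken instruction for auditory learners
    public UnityEvent onStepEntered;    // scene transitions, model highlights, etc.
}

// Minimal step controller: advances through the curriculum one step at a time.
// A HoloLens button (or any UI button) calls NextStep()/PreviousStep() on click.
public class StepController : MonoBehaviour
{
    public List<ProcedureStep> steps = new List<ProcedureStep>();
    public Animator instructionAnimator;   // drives the visual demonstration
    public AudioSource narrationSource;    // plays the voice-over

    public bool playVisuals = true;        // user preference: visual guidance
    public bool playAudio = true;          // user preference: auditory guidance

    private int currentIndex = -1;

    public void NextStep()
    {
        if (currentIndex + 1 >= steps.Count) return;   // already at the last step
        currentIndex++;
        EnterStep(steps[currentIndex]);
    }

    public void PreviousStep()
    {
        if (currentIndex <= 0) return;
        currentIndex--;
        EnterStep(steps[currentIndex]);
    }

    private void EnterStep(ProcedureStep step)
    {
        // Fire any scene-transition or highlight logic wired up in the Inspector.
        step.onStepEntered?.Invoke();

        // Visual channel: trigger the step's animation if the learner wants it.
        if (playVisuals && instructionAnimator != null && !string.IsNullOrEmpty(step.animationTrigger))
            instructionAnimator.SetTrigger(step.animationTrigger);

        // Auditory channel: play the narration if the learner wants it.
        if (playAudio && narrationSource != null && step.narration != null)
        {
            narrationSource.Stop();
            narrationSource.clip = step.narration;
            narrationSource.Play();
        }
    }
}
```

Because each step is plain data configured in the editor, animations and narration can be swapped or reordered without touching the control logic, which is what makes a step flow of this kind reusable across procedures.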
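The summary also does not name the specific OpenCV plugin, so the sketch below assumes a Unity binding that mirrors OpenCV's standard ArUco API (for example, the OpenCV for Unity asset and its OpenCVForUnity.* namespaces). The class name ArucoBoundaryTracker, the choice of marker dictionary, and the helper calls shown are illustrative assumptions, not the project's actual implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;
using OpenCVForUnity.CoreModule;      // Mat, CvType (assumed plugin namespace)
using OpenCVForUnity.ArucoModule;     // Aruco, Dictionary, DetectorParameters
using OpenCVForUnity.ImgprocModule;   // Imgproc (color conversion)
using OpenCVForUnity.UnityUtils;      // Utils.webCamTextureToMat

// Detects ArUco markers in frames from a native or external webcam and exposes
// the detected corners, which the simulation can use as the user-specified
// boundary for placing the virtual scene.
public class ArucoBoundaryTracker : MonoBehaviour
{
    private WebCamTexture webCamTexture;
    private Mat rgbaFrame;
    private Mat grayFrame;
    private Dictionary markerDictionary;
    private DetectorParameters detectorParams;

    void Start()
    {
        // Default (native) webcam; an external camera can be selected by device name.
        webCamTexture = new WebCamTexture();
        webCamTexture.Play();

        grayFrame = new Mat();

        // Markers printed from the 4x4 dictionary mark the corners of the play area.
        markerDictionary = Aruco.getPredefinedDictionary(Aruco.DICT_4X4_50);
        detectorParams = DetectorParameters.create();
    }

    void Update()
    {
        if (!webCamTexture.didUpdateThisFrame) return;

        // Allocate the frame buffer once the camera reports its real resolution.
        if (rgbaFrame == null ||
            rgbaFrame.cols() != webCamTexture.width || rgbaFrame.rows() != webCamTexture.height)
        {
            rgbaFrame = new Mat(webCamTexture.height, webCamTexture.width, CvType.CV_8UC4);
        }

        // Copy the current camera frame into an OpenCV Mat and convert to grayscale.
        Utils.webCamTextureToMat(webCamTexture, rgbaFrame);
        Imgproc.cvtColor(rgbaFrame, grayFrame, Imgproc.COLOR_RGBA2GRAY);

        // Detect all visible markers; 'corners' holds the four image-space corner
        // points of each marker, 'ids' the matching dictionary IDs.
        List<Mat> corners = new List<Mat>();
        Mat ids = new Mat();
        Aruco.detectMarkers(grayFrame, markerDictionary, corners, ids, detectorParams);

        if (ids.total() > 0)
        {
            // The detected corners define the boundary in which the guided scene
            // is anchored; full 3D placement would additionally require camera
            // intrinsics and marker pose estimation.
            Debug.Log($"Detected {ids.total()} ArUco marker(s).");
        }
    }

    void OnDestroy()
    {
        if (webCamTexture != null) webCamTexture.Stop();
    }
}
```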

We have developed a prototype of the needle thoracentesis (NT) procedure in a mixed-reality setting. This guided simulator allows learners to progress at their own pace and supports visual, auditory, or combined learning approaches.

Developing a guided simulator prototype in mixed reality is only the first step toward transforming how neonatal care training is delivered. The step system, adaptable user interface, and iterative development process employed in this simulation can be replicated across other projects; because the step system drives the simulation flow dynamically, animations and dialogue can be swapped with minimal effort. Our goal is to create mixed-reality simulations for a range of healthcare procedures, ultimately enhancing patient safety and reducing the training challenges faced by providers.

Sponsor: