News — Practice makes perfect, and a new system being tested and perfected enables surgical trainees to obtain cutting-edge instruction in real time, all through a new artificial intelligence program.
As medical students conduct surgical exercises, the AI software scans a live video feed and provides immediate, personalized feedback. The solution is among the first generation of AI teachers giving real-time feedback and may pioneer the use of similar instructional technology in other industries, including additional areas of healthcare and medicine.
The AI teacher is being developed by Associate Professor Usman Roshan in Ying Wu College of Computing’s Department of Data Science at New Jersey Institute of Technology, with colleagues from Robert Wood Johnson Medical School (RWJ) and Robust AI, a software company focused on AI-powered human activity recognition products.
Roshan, collaborating with Dr. Advaith Bongu, a transplant surgeon and director of surgical simulation at RWJ, and AI engineer Yunzhe Xue from Robust AI, has been developing the platform since 2023 and is refining it for student training at RWJ, with the expectation of having it embedded in the curriculum in 2025.
The work has thus far been awarded a Rutgers Medical Educator Innovator award and is being presented at surgical meetings and conferences.
According to Bongu, simulation has become an accepted part of the surgical educational curriculum. Surgical trainees develop laparoscopic skills over time and must earn Fundamentals of Laparoscopic Surgery (FLS) certification before graduation. These simulations, while cost-effective and safe for patients, lack an automatic evaluation component and require constant supervision and manual feedback.
Laparoscopic surgery, also known as minimally invasive surgery, is performed through small incisions using specialized instruments and a tiny camera called a laparoscope.
Bongu adds, “Our focus has been on the first task of FLS, which has residents using graspers to transfer six rings from one set of pegs to another and then back again to the original pegs, without dropping a ring and under time constraints.”
The program is more than a computerized version of the game “Operation,” though.
The software, designed and implemented by Xue, uses an underlying single-pass object-detection computer vision model, You Only Look Once (YOLO), to detect the “surgery” and its components. It then determines whether a student has passed or failed the simulation and provides feedback on performance and targeted training guidance, all on real-time video that runs entirely on a laptop that doubles as the simulation monitor.
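The production system’s code is not public, but the pipeline described above can be approximated in a few dozen lines. The sketch below assumes the open-source ultralytics YOLO package and OpenCV; the weights file fls_peg_transfer.pt and class names such as “ring” and “dropped_ring” are hypothetical placeholders, not details from the article. It shows how a single-pass detector can run frame by frame on a live camera feed and overlay immediate feedback on the same laptop screen:

```python
# Minimal sketch of real-time feedback for the FLS peg-transfer task.
# Assumptions (not from the article): the ultralytics YOLO package,
# a hypothetical fine-tuned weights file "fls_peg_transfer.pt", and
# hypothetical class names ("ring", "peg", "grasper", "dropped_ring").
import cv2
from ultralytics import YOLO

model = YOLO("fls_peg_transfer.pt")   # hypothetical fine-tuned detector
cap = cv2.VideoCapture(0)             # live feed from the simulation camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # One forward pass per frame: "You Only Look Once".
    result = model(frame, verbose=False)[0]
    labels = [result.names[int(c)] for c in result.boxes.cls]
    annotated = result.plot()  # copy of the frame with detections drawn
    # Toy feedback rule: warn the trainee the moment a dropped ring appears.
    if "dropped_ring" in labels:
        cv2.putText(annotated, "Ring dropped!", (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 255), 2)
    cv2.imshow("FLS trainer", annotated)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Because detection happens locally, frame by frame, there is no round trip to a server; this is what allows the laptop that displays the simulation to also deliver cues while the trainee’s hands are still on the graspers.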
“The system studies the students’ actions as they perform them live and gives them an assessment that is immediate and automatic, with visual and audio cues in a scenario that feels virtually ‘alive,’” Roshan said.
Andrew Hu, a second-year surgical resident at RWJ, assisted with testing the system over the past year.
“We’re very excited about this project since it steps outside the bounds of the generative AI hype and into the domain of helping humans learn better. We expect broader usage of our software in surgical training programs nationwide and ultimately into other areas of human learning where physical activity is involved,” said Roshan.