In radiation therapy, precision can save lives. Oncologists must carefully map the size and location of a tumor before delivering high-dose radiation to destroy cancer cells while sparing healthy tissue. But this process, called tumor segmentation, is still done manually; it takes time, varies from doctor to doctor and can lead to critical tumor areas being overlooked.
Now, a team of Northwestern Medicine scientists has developed an AI tool called iSeg that not only matches doctors in accurately outlining lung tumors on CT scans but can also identify areas that some doctors may miss, reports a large new study.
Unlike earlier AI tools that focused on static images, iSeg is the first 3D deep learning tool shown to segment tumors as they move with each breath — a critical factor in planning radiation treatment, which half of all cancer patients in the U.S. receive during their illness.
“We’re one step closer to cancer treatments that are even more precise than any of us imagined just a decade ago,” said senior author Dr. Mohamed Abazeed, chair and professor of radiation oncology at Northwestern University Feinberg School of Medicine.
“The goal of this technology is to give our doctors better tools,” added Abazeed, who leads a research team developing data-driven tools to personalize and improve cancer treatment and is a member of the Robert H. Lurie Comprehensive Cancer Center of Northwestern University.
The study was published today (June 30) in the journal npj Precision Oncology.
How iSeg was built and tested
The Northwestern scientists trained iSeg using CT scans and doctor-drawn tumor outlines from hundreds of lung cancer patients treated at nine clinics within the Northwestern Medicine and Cleveland Clinic health systems. That’s far beyond the small, single-hospital datasets used in many past studies.
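To make the idea of "training on CT scans and doctor-drawn tumor outlines" concrete, the sketch below shows what a single supervised training step for a 3D segmentation network can look like. It is not the iSeg model or its training pipeline; the tiny network, the tensor sizes and the random placeholder data are assumptions made purely for illustration.

```python
# Illustrative sketch only: one supervised training step for a 3D segmentation
# network in PyTorch. The real iSeg architecture and training details are
# described in the paper; everything here is a stand-in.
import torch
import torch.nn as nn

class TinySegNet3D(nn.Module):
    """A deliberately small placeholder for a volumetric segmentation model."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(8, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(8, 1, kernel_size=1),  # per-voxel tumor logit
        )

    def forward(self, x):
        return self.net(x)

model = TinySegNet3D()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # voxel-wise "tumor vs. not tumor"

# Placeholder batch: 2 CT volumes (1 channel, 32x32x32 voxels) and the
# corresponding physician-drawn tumor masks as binary targets.
ct_volumes = torch.randn(2, 1, 32, 32, 32)
doctor_masks = (torch.rand(2, 1, 32, 32, 32) > 0.95).float()

# One training step: predict a mask, compare it to the expert contour, update.
logits = model(ct_volumes)
loss = loss_fn(logits, doctor_masks)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

A production model would be far larger and trained on full patient volumes from the participating clinics, but the basic loop of predicting a contour and correcting it against a physician's outline is the same.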
After training, the AI was tested on patient scans it hadn’t seen before. Its tumor outlines were then compared to those drawn by physicians. The study found that iSeg consistently matched expert outlines across hospitals and scan types. It also flagged additional areas that some doctors missed — and those missed areas were linked to worse outcomes if left untreated. This suggests iSeg may help catch high-risk regions that often go unnoticed.
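One common way segmentation research quantifies how closely an AI-drawn outline matches a physician's is the Dice overlap coefficient. The paper's exact evaluation metrics are not restated here, so the sketch below is a generic, hypothetical example using made-up 3D masks.

```python
# Illustrative sketch only: scoring agreement between an AI-drawn tumor
# outline and a physician-drawn one with the Dice overlap coefficient.
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice = 2*|A and B| / (|A| + |B|); 1.0 means identical outlines, 0.0 no overlap."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / total

# Toy 3D masks standing in for a physician contour and an AI contour.
rng = np.random.default_rng(0)
physician = rng.random((64, 64, 64)) > 0.97
ai_contour = physician.copy()
ai_contour[30:34, 30:34, 30:34] = True  # the AI marks a small extra region

print(f"Dice overlap: {dice_coefficient(physician, ai_contour):.3f}")
```

In this toy example the extra block the AI marks slightly lowers the overlap score, which is how additional flagged regions show up when outlines are compared voxel by voxel.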
“Accurate tumor targeting is the foundation of safe and effective radiation therapy, where even small errors in targeting can impact tumor control or cause unnecessary toxicity,” Abazeed said.
“By automating and standardizing tumor contouring, our AI tool can help reduce delays, ensure fairness across hospitals and potentially identify areas that doctors might miss — ultimately improving patient care and clinical outcomes,” added first author Sagnik Sarkar, a senior research technologist at Feinberg who holds a Master of Science in artificial intelligence from Northwestern.
Clinical deployment possible ‘within a couple of years’
The research team is now testing iSeg in clinical settings, comparing its performance to physicians in real time. They are also integrating features like user feedback and working to expand the technology to other tumor types, such as liver, brain and prostate cancers. The team also plans to adapt iSeg to other imaging methods, including MRI and PET scans.
“We envision this as a foundational tool that could standardize and enhance how tumors are targeted in radiation oncology, especially in settings where access to subspecialty expertise is limited,” said co-author Troy Teo, instructor of radiation oncology at Feinberg.
“This technology can help support more consistent care across institutions, and we believe clinical deployment could be possible within a couple of years,” Teo added.
This study is titled “Deep learning for automated, motion-resolved tumor segmentation in radiotherapy.”