Artificial Intelligence Accurately Predicts Radiation Treatment Side Effects

By MedImaging International staff writers
Posted on 01 Oct 2019
Image: New research has shown that a computer model can predict side effects associated with radiation therapy (Photo courtesy of Technology Networks).
Researchers from the University of Texas MD Anderson Cancer Center (Houston, Texas, USA) have demonstrated that a sophisticated computer model can accurately predict two of the most challenging side effects associated with radiation therapy for head and neck cancer. This precision oncology approach has the potential to better identify patients who might benefit from early interventions that could help prevent significant weight loss after treatment or reduce the need for feeding tube placement.

The team of researchers developed models to analyze large sets of data merged from three sources: electronic health records (Epic), an internal web-based charting tool (Brocade) and the record/verify system (Mosaiq). The data included more than 700 clinical and treatment variables for patients with head and neck cancer (75% male/25% female, with a median age of 62 years) who received more than 2,000 courses of radiation therapy (median dose 60 Gy) across five practice sites at MD Anderson from 2016 to 2018.
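As a rough illustration (not the study's actual pipeline), assembling per-course records from exports of such systems might look like the following Python sketch; the file names and column names here are assumptions for the example only.

```python
# Minimal sketch (assumed file and column names, not MD Anderson's pipeline):
# merge per-course records from hypothetical exports of the EHR (Epic), the
# charting tool (Brocade) and the record/verify system (Mosaiq).
import pandas as pd

epic = pd.read_csv("epic_clinical.csv")        # demographics, labs, diagnoses
brocade = pd.read_csv("brocade_charting.csv")  # on-treatment assessments
mosaiq = pd.read_csv("mosaiq_rt.csv")          # prescribed dose, fractions, site

# Inner-join on a shared treatment-course key so each row is one course of RT.
features = (
    epic.merge(brocade, on="course_id", how="inner")
        .merge(mosaiq, on="course_id", how="inner")
)

# Outcome labels (e.g. significant weight loss, feeding tube placement) would
# be derived from follow-up data and attached on the same key.
print(features.shape)  # roughly (n_courses, several hundred variables)
```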

The researchers used the models to predict three endpoints: significant weight loss, feeding tube placement and unplanned hospitalizations. Results from the best-performing model were then validated against 225 subsequent consecutive courses of radiation therapy. Models that met a pre-specified performance threshold of an area under the curve (AUC) of 0.70 or higher were considered clinically valid (an AUC of 1.0 indicates a model that perfectly separates patients who do and do not experience the outcome, while 0.5 is no better than chance). The models predicted the likelihood of significant weight loss (AUC = 0.751) and need for feeding tube placement (AUC = 0.755) with a high degree of accuracy.
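The validation criterion described above can be sketched as follows; the synthetic data and gradient-boosting model below are stand-ins, not the study's actual model, and only the 0.70 AUC cutoff and the 225-course validation set size come from the article.

```python
# Illustrative sketch of validating a risk model against later courses and
# checking it against a pre-specified AUC threshold of 0.70.
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

# Synthetic stand-in for ~2,000 courses with an imbalanced outcome.
X, y = make_classification(n_samples=2000, n_features=50,
                           weights=[0.8, 0.2], random_state=0)

# Hold out the last 225 courses (no shuffling) to mimic validation on
# subsequent consecutive treatments.
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=225, shuffle=False)

model = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])

AUC_THRESHOLD = 0.70  # pre-specified clinical-validity cutoff from the study
print(f"validation AUC = {auc:.3f}, clinically valid: {auc >= AUC_THRESHOLD}")
```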

“Being able to identify which patients are at greatest risk would allow radiation oncologists to take steps to prevent or mitigate these possible side effects,” said Jay Reddy, MD, PhD, an assistant professor of radiation oncology at The University of Texas MD Anderson Cancer Center and lead author on the study. “If the patient has an intermediate risk, and they might get through treatment without needing a feeding tube, we could take precautions such as setting them up with a nutritionist and providing them with nutritional supplements. If we know their risk for feeding tube placement is extremely high – a better than 50% chance they would need one – we could place it ahead of time so they wouldn’t have to be admitted to the hospital after treatment. We’d know to keep a closer eye on that patient.”
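The kind of risk-banded triage Dr. Reddy describes could be expressed as a simple decision rule; in the sketch below, only the greater-than-50% band comes from his example, while the intermediate cutoff and the suggested actions are illustrative assumptions rather than MD Anderson protocol.

```python
# Hedged sketch of mapping a predicted feeding-tube risk to a suggested action.
def feeding_tube_triage(predicted_risk: float) -> str:
    """Return an illustrative action for a model-predicted feeding-tube risk."""
    if predicted_risk > 0.50:
        # Quoted example: better than 50% chance -> consider placing the tube
        # ahead of time rather than during an unplanned admission.
        return "discuss prophylactic feeding tube placement before treatment"
    if predicted_risk > 0.20:  # assumed intermediate-risk band
        return "refer to a nutritionist, provide supplements, monitor closely"
    return "routine monitoring during treatment"

print(feeding_tube_triage(0.62))
```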

The machine learning approach cannot isolate the single most predictive factor, or combination of factors, that leads to negative side effects, but it can give patients and their clinicians a better sense of what to expect during the course of treatment. Beyond predicting the likelihood of side effects, machine learning models could potentially predict which treatment plans would be most effective for different types of patients, allowing for more personalized approaches to radiation oncology.

“Machine learning can make doctors more efficient and treatment safer by reducing the risk of error,” added Dr. Reddy. “It has the potential for influencing all aspects of radiation oncology today – anything where a computer can look at data and recognize a pattern.”

Related Links:
University of Texas MD Anderson Cancer Center
