Researchers Use AI to Improve Mammogram Interpretation

By MedImaging International staff writers
Posted on 04 Jul 2018
Image: Researchers used AI to improve mammogram image interpretation (Photo courtesy of the Department of Energy’s Oak Ridge National Laboratory).
A team of researchers at the Department of Energy’s Oak Ridge National Laboratory (Oak Ridge, TN, USA) successfully used artificial intelligence to improve understanding of the cognitive processes involved in image interpretation. Their work, which was published in the Journal of Medical Imaging, will help reduce errors in the analyses of diagnostic images by health professionals and has the potential to improve health outcomes for women affected by breast cancer.

Early detection of breast cancer is critical for effective treatment, and detection depends on accurate interpretation of a patient’s mammogram. The ORNL-led team found that radiologists’ analyses of mammograms were significantly influenced by context bias, that is, by the radiologist’s previous diagnostic experiences. New radiology trainees were the most susceptible to the phenomenon, although even more experienced radiologists fell victim to it to some degree, according to the researchers.

The researchers designed an experiment that followed the eye movements of radiologists at various skill levels to better understand the context bias involved in their individual interpretations of the images. The experiment tracked the eye movements of three board-certified radiologists and seven radiology residents as they analyzed 100 mammographic studies (400 images in all) from the University of South Florida’s Digital Database for Screening Mammography. The images, representing a mix of cancer, no cancer, and benign cases that mimicked cancer, were specifically selected to cover a range of cases similar to that found in a clinical setting.

The participants, who were grouped by level of experience and had no prior knowledge of what was contained in the individual X-rays, were outfitted with a head-mounted eye-tracking device that recorded their “raw gaze data,” which characterized their overall visual behavior. The study also recorded the participants’ diagnostic decisions: the locations of suspicious findings along with their characteristics according to the BI-RADS lexicon, the standard reporting scheme for mammograms. By computing a measure known as the fractal dimension of each participant’s scan path (the map of eye movements) and performing a series of statistical calculations, the researchers were able to discern how the participants’ eye movements differed from mammogram to mammogram. They also calculated how this deviation varied across image categories, such as images that show cancer and those that are easier or harder to decipher.
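
The paper itself is not quoted here on the exact computation, but a standard way to estimate the fractal dimension of a scan path is box counting: overlay grids of increasing resolution on the path and measure how the number of occupied cells scales with grid size. The Python sketch below illustrates that idea only; the function name, grid sizes, and synthetic data are illustrative assumptions, not the study’s actual pipeline.

```python
import numpy as np

def box_counting_dimension(scanpath, grid_sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the fractal dimension of a 2-D scan path by box counting.

    scanpath: (N, 2) array of gaze coordinates in pixels.
    grid_sizes: number of boxes per side at each counting scale.
    """
    pts = np.asarray(scanpath, dtype=float)
    # Normalize coordinates into the unit square.
    pts -= pts.min(axis=0)
    span = pts.max(axis=0)
    span[span == 0] = 1.0  # guard against degenerate (flat) paths
    pts /= span

    counts = []
    for n in grid_sizes:
        # Assign each gaze point to an n x n grid cell, count occupied cells.
        cells = np.floor(pts * n).clip(0, n - 1).astype(int)
        counts.append(len({tuple(c) for c in cells}))

    # The dimension is the slope of log(occupied cells) vs. log(grid size).
    slope, _ = np.polyfit(np.log(grid_sizes), np.log(counts), 1)
    return slope

# Hypothetical example: 500 gaze samples on a 1024 x 768 display.
rng = np.random.default_rng(0)
path = rng.random((500, 2)) * [1024, 768]
print(f"Estimated fractal dimension: {box_counting_dimension(path):.2f}")
```

A space-filling, erratic scan path occupies more cells at fine resolutions and yields a higher estimate; a compact, orderly path yields a lower one, which is what makes the measure useful for comparing search behavior across mammograms.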

To track the participants’ eye movements effectively, the researchers had to employ real-time sensor data, which logs nearly every movement of the participants’ eyes. With 10 observers interpreting 100 cases, however, the volume of data quickly grew too large to manage manually, leading the researchers to turn to artificial intelligence to make sense of the results efficiently and effectively. Using ORNL’s Titan supercomputer, the researchers were able to rapidly train the deep learning models required to make sense of the large datasets. Whereas similar studies in the past have used aggregation methods to summarize such enormous datasets, the ORNL team processed the full data sequence, a critical step because, over time, this sequence revealed differences in the participants’ eye paths as they analyzed the various mammograms.
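
To make the aggregation-versus-full-sequence distinction concrete, the sketch below contrasts the two representations under assumed inputs. The helpers aggregate_features and full_sequence, and the 60 Hz sampling rate, are hypothetical; the study’s actual preprocessing is not described in this article.

```python
import numpy as np

def aggregate_features(gaze):
    """Aggregation-style summary: the whole recording is compressed into a
    handful of numbers, and the temporal order of fixations is lost."""
    xy = gaze[:, :2]
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    return {
        "path_length": steps.sum(),
        "mean_step": steps.mean(),
        "coverage_x": np.ptp(xy[:, 0]),
        "coverage_y": np.ptp(xy[:, 1]),
    }

def full_sequence(gaze, length=2048):
    """Full-sequence representation: resample the raw (x, y, t) log onto a
    fixed-length time grid so the entire trajectory stays model-ready."""
    t = gaze[:, 2]
    grid = np.linspace(t[0], t[-1], length)
    x = np.interp(grid, t, gaze[:, 0])
    y = np.interp(grid, t, gaze[:, 1])
    return np.stack([x, y], axis=0)  # shape (2, length): channels x time

# Hypothetical 30 s recording at 60 Hz: columns are x, y, timestamp.
rng = np.random.default_rng(1)
t = np.arange(0, 30, 1 / 60)
gaze = np.column_stack([rng.random(t.size) * 1024,
                        rng.random(t.size) * 768, t])
print(aggregate_features(gaze))
print(full_sequence(gaze).shape)  # (2, 2048)
```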

In a related paper published in the Journal of Human Performance in Extreme Environments, the researchers demonstrated how convolutional neural networks, a type of artificial intelligence commonly applied to the analysis of images, significantly outperformed other methods, such as deep neural networks and deep belief networks, in parsing the eye tracking data and, by extension, validating the experiment as a means to measure context bias. Furthermore, while the experiment focused on radiology, the resulting data drove home the need for “intelligent interfaces and decision support systems” to assist human performance across a range of complex tasks including air-traffic control and battlefield management.
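
As a rough illustration of how a convolutional network can parse eye-tracking data, the PyTorch sketch below classifies fixed-length gaze sequences (such as those produced by full_sequence above) with 1-D convolutions over the time axis. The architecture, the three output classes, and all hyperparameters are assumptions made for this example; the paper’s actual models are not reproduced here.

```python
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    """A small 1-D convolutional classifier over gaze sequences.
    Input: (batch, 2, length) tensors of (x, y) channels over time."""
    def __init__(self, n_classes=3):  # e.g. cancer / benign mimic / normal
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.AdaptiveAvgPool1d(1),  # pool over time to a fixed size
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).squeeze(-1))

# One forward pass on a dummy batch of 8 sequences of length 2048.
model = GazeCNN()
logits = model(torch.randn(8, 2, 2048))
print(logits.shape)  # torch.Size([8, 3])
```

The design point is that convolutions pick up local temporal patterns in the gaze trajectory, exactly the information that aggregate summary statistics discard, which is consistent with the reported advantage of CNNs over plain deep networks on this data.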

While machines are unlikely to replace radiologists (or other humans involved in rapid, high-impact decision-making) any time soon, they do hold enormous potential to assist health professionals and other decision makers in reducing errors due to phenomena such as context bias, according to Gina Tourassi, team lead and director of ORNL’s Health Data Science Institute. “These findings will be critical in the future training of medical professionals to reduce errors in the interpretations of diagnostic imaging. These studies will inform human/computer interactions going forward, as we use artificial intelligence to augment and improve human performance,” said Tourassi.

Related Links:
Oak Ridge National Laboratory
